Multi-kernel Partial Least Squares for Multi-Modal Data Analysis
- DOI
- 10.2991/emcm-16.2017.177
- Keywords
- Partial least squares regression; Multi-kernel learning; Multi-modal classification; Multi-modal retrieval; Canonical correlation analysis
- Abstract
In recent years, multi-modal data analysis has attracted increasing attention. Multi-modal data are data of different modalities that represent the same semantics, and many subspace learning methods have been proposed to measure the correlation between different modalities. As the most representative subspace learning method, canonical correlation analysis (CCA) and its variants project data of different modalities into a common space where the Pearson correlation is maximized. However, CCA often incurs information loss when mapping between modalities, so the partial least squares regression (PLSR) model has been adopted to address this problem. To account for the nonlinearity of data, kernel partial least squares regression (KPLSR) was subsequently proposed, but its performance depends heavily on the choice of kernel parameters. Hence, we propose to apply multi-kernel partial least squares regression (MKPLSR) to multi-modal data analysis. To evaluate the proposed approach, extensive experiments are carried out. The experimental results on two benchmark datasets composed of image-text pairs show that our approach outperforms previous methods when applied to multi-modal retrieval and classification.
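The following is a minimal, illustrative sketch (not the authors' implementation) of the multi-kernel PLSR idea described in the abstract: several RBF kernels on image features are combined into one kernel matrix, which is then regressed onto text features with PLS. The kernel widths, uniform combination weights, and toy data are hypothetical choices for demonstration only.

```python
# Sketch of multi-kernel PLS regression between two modalities (toy data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X_img = rng.standard_normal((100, 128))   # image-modality features (toy)
Y_txt = rng.standard_normal((100, 10))    # text-modality features (toy)

# Build a bank of base kernels with different bandwidths (assumed values).
gammas = [0.01, 0.1, 1.0]
kernels = [rbf_kernel(X_img, X_img, gamma=g) for g in gammas]

# Combine kernels with fixed uniform weights; the paper's method concerns
# how to combine multiple kernels, which this sketch does not reproduce.
weights = np.ones(len(kernels)) / len(kernels)
K = sum(w * Km for w, Km in zip(weights, kernels))

# Center the combined kernel matrix, as is standard in kernel PLS.
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
K_centered = H @ K @ H

# Use rows of the centered kernel as features and fit PLS toward the text modality.
pls = PLSRegression(n_components=5)
pls.fit(K_centered, Y_txt)

# Latent image-side scores usable for cross-modal retrieval or classification.
latent_img = pls.transform(K_centered)
print(latent_img.shape)   # (100, 5)
```

In practice the combination weights would be learned or tuned rather than fixed, and test samples would be projected through kernels computed against the training set.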
- Copyright
- © 2017, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY - CONF
AU - Ping Wang
AU - Hong Zhang
PY - 2017/02
DA - 2017/02
TI - Multi-kernel Partial Least Squares for Multi-Modal Data Analysis
BT - Proceedings of the 2016 7th International Conference on Education, Management, Computer and Medicine (EMCM 2016)
PB - Atlantis Press
SN - 2352-538X
UR - https://doi.org/10.2991/emcm-16.2017.177
DO - 10.2991/emcm-16.2017.177
ID - Wang2017/02
ER -