Manifold Learning Method for Large Scale Dataset Based on Gradient Descent
- DOI
- 10.2991/icmt-13.2013.145
- Keywords
- manifold learning; LLE; gradient descent; time complexity.
- Abstract
Dimension reduction has been a research hotspot in recent years, especially manifold learning for high-dimensional data. Because high-dimensional data have complex nonlinear structures, many researchers focus on nonlinear methods. However, when the scale of the data is very large, the memory cost and running time become prohibitive. To solve this problem, we use gradient descent to search for the low-dimensional embedding, replacing the eigenvalue decomposition of a large sparse matrix in the LLE (Locally Linear Embedding) algorithm. The time complexity is lower than before, and the required storage memory declines noticeably. Experimental results demonstrate that our approach performs better than the original algorithm. Furthermore, our approach can be applied to other manifold learning methods and to other research fields such as information retrieval and feature extraction.
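For illustration, here is a minimal sketch of the idea the abstract describes: computing standard LLE reconstruction weights, then finding the embedding by gradient descent on the usual LLE cost tr(YᵀMY) with M = (I − W)ᵀ(I − W), instead of taking the bottom eigenvectors of M. The update rule, step size, and constraint handling below are assumptions made for the sketch, not the authors' exact method, and all function names are illustrative.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def lle_weights(X, k=10, reg=1e-3):
    """Standard LLE step: reconstruct each point from its k neighbors."""
    n = X.shape[0]
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    W = np.zeros((n, n))
    for i in range(n):
        nbr = idx[i, 1:]                      # skip the point itself
        Z = X[nbr] - X[i]                     # neighbors centered on x_i
        G = Z @ Z.T                           # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)    # regularize for stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbr] = w / w.sum()               # weights sum to one
    return W

def embed_gradient_descent(W, d=2, lr=0.1, n_iter=500, seed=0):
    """Minimize tr(Y^T M Y), M = (I - W)^T (I - W), by gradient descent
    rather than eigendecomposition (assumed reading of the abstract).
    Re-centering and whitening each step keep the usual LLE constraints
    (zero mean, unit covariance) and prevent the trivial solution Y = 0."""
    n = W.shape[0]
    Y = np.random.default_rng(seed).standard_normal((n, d)) * 0.01
    IW = np.eye(n) - W
    M = IW.T @ IW          # dense here for brevity; sparse in practice
    for _ in range(n_iter):
        Y -= lr * 2.0 * (M @ Y)               # gradient of tr(Y^T M Y)
        Y -= Y.mean(axis=0)                   # enforce zero mean
        L = np.linalg.cholesky((Y.T @ Y) / n)  # covariance factor
        Y = np.linalg.solve(L, Y.T).T         # whiten: (1/n) Y^T Y = I
    return Y
```

With the whitening step, the iteration behaves like orthogonal iteration on I − 2·lr·M, so for a small enough step size it converges to the span of the bottom eigenvectors of M, i.e. the same subspace the eigendecomposition would return, while only ever forming matrix-vector products that stay cheap when M is kept sparse.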
- Copyright
- © 2013, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Wang Yunhe
AU  - Gao Yuan
AU  - Xu Chao
PY  - 2013/11
DA  - 2013/11
TI  - Manifold Learning Method for Large Scale Dataset Based on Gradient Descent
BT  - Proceedings of 3rd International Conference on Multimedia Technology (ICMT-13)
PB  - Atlantis Press
SP  - 1180
EP  - 1187
SN  - 1951-6851
UR  - https://doi.org/10.2991/icmt-13.2013.145
DO  - 10.2991/icmt-13.2013.145
ID  - Yunhe2013/11
ER  -