Neural Network Optimization Algorithm Model Combining L1 / 2 Regularization and Extreme Learning Machine
- DOI
- 10.2991/iwmecs-18.2018.23
- Keywords
- ELM, L1/2 Regularization, Neural Network Optimization
- Abstract
Extreme Learning Machine (ELM) is a fast learning algorithm that uses a random mechanism to reduce parameter setting and selection, thereby greatly improving learning speed while maintaining generalization ability. Unlike traditional learning methods, the ELM hidden-layer parameters are not tuned iteratively but generated randomly, so the nonlinear system represented by a feedforward neural network reduces to a linear system in which only the output weights need to be computed, and this linear system can be solved directly by the least-squares method. Although the ELM algorithm is fast, its effectiveness on large-scale data still needs to be improved. Based on L1/2 regularization theory and full-rank Cholesky matrix factorization, this paper proposes two improved ELM algorithms for large-scale data.
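As a rough illustration of the linear system the abstract refers to, the following minimal Python sketch trains a basic ELM: the hidden-layer input weights and biases are drawn at random and left fixed, and only the output weights are computed, by a single least-squares (pseudoinverse) solve. The names elm_train, n_hidden, and the tanh activation are illustrative assumptions, not details taken from the paper, and the sketch does not include the L1/2-regularized or Cholesky-based variants the paper proposes.

```python
# Minimal ELM sketch (illustrative; names and activation are assumptions, not from the paper).
import numpy as np

def elm_train(X, T, n_hidden=100, seed=0):
    """Single-hidden-layer ELM: random input weights, least-squares output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Example usage on synthetic regression data
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
T = np.sin(X[:, :1]) + 0.1 * rng.standard_normal((200, 1))
W, b, beta = elm_train(X, T, n_hidden=50)
print(elm_predict(X, W, b, beta).shape)  # (200, 1)
```

Because training reduces to one matrix factorization or pseudoinverse computation rather than iterative gradient updates, this is the step whose cost dominates on large-scale data, which is what the paper's L1/2-regularized and full-rank Cholesky variants aim to improve.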
- Copyright
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Anzhi Qi
PY  - 2018/04
DA  - 2018/04
TI  - Neural Network Optimization Algorithm Model Combining L1 / 2 Regularization and Extreme Learning Machine
BT  - Proceedings of the 2018 3rd International Workshop on Materials Engineering and Computer Sciences (IWMECS 2018)
PB  - Atlantis Press
SP  - 106
EP  - 109
SN  - 2352-538X
UR  - https://doi.org/10.2991/iwmecs-18.2018.23
DO  - 10.2991/iwmecs-18.2018.23
ID  - Qi2018/04
ER  -