Global Learning of Neural Networks by Using Hybrid Optimization Algorithm
- DOI
- 10.2991/iske.2007.201
- Keywords
- Neural Networks, Global Learning, Stochastic Approximation, Gradient Descent, Backpropagation Algorithm
- Abstract
This paper proposes a global learning method for neural networks based on a hybrid optimization algorithm. The hybrid algorithm combines stochastic approximation with gradient descent. Stochastic approximation is applied first to estimate a starting point biased toward the global minimum, escaping local minima; the backpropagation (BP) algorithm is then applied as the gradient-descent stage for high-speed convergence. The proposed method has been applied to the 8-bit parity check and 6-bit symmetry check problems. The experimental results show that the proposed method achieves better convergence performance than the conventional method, namely the BP algorithm with randomized initial weight settings.
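The full text's exact update rules are not reproduced here, so the following is only a minimal, hypothetical Python sketch of the two-phase idea the abstract describes: a decaying-step random-perturbation search stands in for the stochastic-approximation phase, followed by plain batch backpropagation, demonstrated on the 8-bit parity problem the abstract mentions. All function names, the network size, and the hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# 8-bit parity data: target is 1 when the number of set bits is odd.
X = np.array([[(i >> b) & 1 for b in range(8)] for i in range(256)], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_in=8, n_hid=16, n_out=1):
    # Hypothetical architecture: one hidden layer of 16 sigmoid units.
    return [rng.normal(0, 0.5, (n_in, n_hid)), np.zeros(n_hid),
            rng.normal(0, 0.5, (n_hid, n_out)), np.zeros(n_out)]

def loss(w):
    W1, b1, W2, b2 = w
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return float(np.mean((out - y) ** 2))

# Phase 1: stochastic-approximation stage (here, a random-perturbation
# search with a decaying step size, as a stand-in for the paper's method)
# to find a starting point more likely to lie in the global basin.
def stochastic_phase(w, iters=2000, sigma0=0.5):
    best, best_loss = [p.copy() for p in w], loss(w)
    for t in range(iters):
        sigma = sigma0 / (1.0 + t / 200.0)      # decaying perturbation size
        cand = [p + rng.normal(0, sigma, p.shape) for p in best]
        cand_loss = loss(cand)
        if cand_loss < best_loss:               # keep improving candidates only
            best, best_loss = cand, cand_loss
    return best

# Phase 2: plain backpropagation (batch gradient descent on MSE) for
# fast local convergence from the point found in phase 1.
def backprop_phase(w, iters=5000, lr=0.5):
    W1, b1, W2, b2 = w
    for _ in range(iters):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Error signal through the output sigmoid (constant MSE factor
        # absorbed into the learning rate).
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out) / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * (X.T @ d_h) / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return [W1, b1, W2, b2]

w = stochastic_phase(init_weights())
w = backprop_phase(w)
print("final MSE on 8-bit parity:", loss(w))
```

The design point the abstract emphasizes is the division of labor: the stochastic phase supplies global exploration (it accepts only improving random moves, so it cannot get trapped the way pure gradient descent can), while BP supplies the fast local refinement that random search alone lacks.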
- Copyright
- © 2007, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Yong-Hyun Cho
AU  - Seong-Jun Hong
PY  - 2007/10
DA  - 2007/10
TI  - Global Learning of Neural Networks by Using Hybrid Optimization Algorithm
BT  - Proceedings of the 2007 International Conference on Intelligent Systems and Knowledge Engineering (ISKE 2007)
PB  - Atlantis Press
SP  - 1179
EP  - 1184
SN  - 1951-6851
UR  - https://doi.org/10.2991/iske.2007.201
DO  - 10.2991/iske.2007.201
ID  - Cho2007/10
ER  -