Research on the Solution of BP Neural Network Training Problem
- DOI
- 10.2991/icaita-18.2018.11
- Keywords
- neural network; gradient diffusion; activation function; convergence speed; classification accuracy
- Abstract
The BP neural network is a common kind of neural network for recognition and classification, but several well-known difficulties arise during its training process. In this paper, we introduce the basic principle of the BP algorithm and propose a parameter adjustment procedure for the activation function that speeds up convergence and avoids gradient diffusion; the feasibility of this method is demonstrated on a basic BP neural network with a simple structure. We analyze the feasible range of this parameter through a group of experiments based on a simple three-layer fully connected BP network. In addition, this activation-function parameter adjustment procedure is also suitable for multilayer BP neural networks.
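The abstract does not give the concrete adjustment rule, so the following is only a minimal sketch of the general idea: a sigmoid activation with a tunable steepness parameter (here called `beta`) whose value scales the derivative used in backpropagation, which is one common way to counteract gradient diffusion. The parameter name, layer sizes, loss, and learning rate below are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def sigmoid(x, beta=1.0):
    """Sigmoid with an adjustable steepness parameter: f(x) = 1 / (1 + exp(-beta * x))."""
    return 1.0 / (1.0 + np.exp(-beta * x))

def sigmoid_grad(x, beta=1.0):
    """Derivative w.r.t. x: beta * f(x) * (1 - f(x)).
    A larger beta keeps the gradient from shrinking as quickly near zero."""
    s = sigmoid(x, beta)
    return beta * s * (1.0 - s)

# One forward/backward pass of a simple three-layer fully connected BP network
# (sizes and data are arbitrary, chosen only to make the sketch runnable).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # 4 samples, 8 input features
y = rng.integers(0, 2, size=(4, 1)).astype(float)

W1, b1 = rng.normal(scale=0.1, size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=(16, 1)), np.zeros(1)
beta = 2.0                                        # example value of the adjustable parameter
lr = 0.1

# Forward pass.
z1 = x @ W1 + b1
a1 = sigmoid(z1, beta)
z2 = a1 @ W2 + b2
a2 = sigmoid(z2, beta)

# Backward pass (mean squared error), showing where beta enters the gradients.
delta2 = (a2 - y) * sigmoid_grad(z2, beta)
delta1 = (delta2 @ W2.T) * sigmoid_grad(z1, beta)

# Gradient-descent weight updates.
W2 -= lr * a1.T @ delta2
b2 -= lr * delta2.sum(axis=0)
W1 -= lr * x.T @ delta1
b1 -= lr * delta1.sum(axis=0)
```

In this sketch the steepness parameter appears as a multiplicative factor in every layer's error term, which is why tuning it can change both the convergence speed and how quickly the gradient decays through the layers.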
- Copyright
- © 2018, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
- Cite this article
TY - CONF
AU - Junyang Tan
AU - Dan Xia
AU - Shiyun Dong
AU - Binshi Xu
AU - Ye Li
PY - 2018/03
DA - 2018/03
TI - Research on the Solution of BP Neural Network Training Problem
BT - Proceedings of the 2018 2nd International Conference on Artificial Intelligence: Technologies and Applications (ICAITA 2018)
PB - Atlantis Press
SP - 41
EP - 45
SN - 1951-6851
UR - https://doi.org/10.2991/icaita-18.2018.11
DO - 10.2991/icaita-18.2018.11
ID - Tan2018/03
ER -