Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/99834
Title: Global convergence of online BP training with dynamic learning rate
Authors: Zhang, Rui
Xu, Zong-Ben
Huang, Guang-Bin
Wang, Dianhui
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2012
Source: Zhang, R., Xu, Z. B., Huang, G. B., & Wang, D. (2012). Global convergence of online BP training with dynamic learning rate. IEEE Transactions on Neural Networks and Learning Systems, 23(2), 330-341.
Series/Report no.: IEEE Transactions on Neural Networks and Learning Systems
Abstract: The online backpropagation (BP) training procedure has been extensively explored in scientific research and engineering applications. One of the main factors affecting the performance of online BP training is the learning rate. This paper proposes a new dynamic learning rate based on an estimate of the minimum error. The global convergence theory of the online BP training procedure with the proposed learning rate is then established. It is proved that: 1) the error sequence converges to the global minimum error; and 2) the weight sequence converges to a fixed point at which the error function attains its global minimum. The resulting global convergence theory underlies the successful applications of the online BP training procedure. Illustrative examples are provided to support the theoretical analysis.
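The abstract does not state the exact form of the dynamic learning rate, only that it is derived from an estimate of the minimum error. The Python sketch below is a minimal, hypothetical illustration of online (per-sample) BP training with a Polyak-style step size eta_t = (E_t - E_min_hat) / (||grad_t||^2 + eps); the toy network, the data, and the choice E_min_hat = 0 are assumptions made for illustration and are not taken from the paper.

    # Illustrative sketch only (not the authors' exact rule): online BP on a
    # one-hidden-layer network, with a dynamic learning rate computed from an
    # assumed estimate of the minimum error (Polyak-style step size).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(x), presented one sample at a time (online training).
    X = rng.uniform(-np.pi, np.pi, size=(200, 1))
    Y = np.sin(X)

    n_in, n_hidden, n_out = 1, 10, 1
    W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))
    b2 = np.zeros(n_out)

    E_min_hat = 0.0   # assumed estimate of the attainable minimum error (hypothetical choice)
    eps = 1e-8        # guards against division by a vanishing gradient norm

    def forward(x):
        """One forward pass; returns the hidden activation and the output."""
        h = np.tanh(W1 @ x + b1)
        y = W2 @ h + b2
        return h, y

    for epoch in range(50):
        for x, t in zip(X, Y):
            h, y = forward(x)
            err = y - t
            E = 0.5 * float(err @ err)            # per-sample squared error

            # Backpropagate to get the gradients of E w.r.t. all weights.
            dy = err                              # dE/dy
            gW2 = np.outer(dy, h)
            gb2 = dy
            dh = (W2.T @ dy) * (1.0 - h ** 2)     # through tanh
            gW1 = np.outer(dh, x)
            gb1 = dh

            # Dynamic learning rate from the estimated minimum error (sketch only).
            gnorm2 = (gW1 ** 2).sum() + (gb1 ** 2).sum() + (gW2 ** 2).sum() + (gb2 ** 2).sum()
            eta = (E - E_min_hat) / (gnorm2 + eps)

            # Online (per-sample) weight update.
            W1 -= eta * gW1; b1 -= eta * gb1
            W2 -= eta * gW2; b2 -= eta * gb2

    pred = np.array([forward(x)[1] for x in X]).reshape(-1, 1)
    print("final mean squared error:", float(np.mean((pred - Y) ** 2)))

In this sketch the step size shrinks as the per-sample error approaches the estimated minimum, which mirrors the intuition behind the convergence statements in the abstract; the rule actually analyzed in the paper, and the conditions under which convergence is proved, may differ.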
URI: https://hdl.handle.net/10356/99834
http://hdl.handle.net/10220/13532
ISSN: 2162-237X
DOI: http://dx.doi.org/10.1109/TNNLS.2011.2178315
Rights: © 2012 IEEE
Fulltext: No Fulltext
Appears in Collections: EEE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.