AN ACCELERATED LEARNING ALGORITHM FOR MULTILAYER PERCEPTRON NETWORKS

Citation
A.G. Parlos et al., AN ACCELERATED LEARNING ALGORITHM FOR MULTILAYER PERCEPTRON NETWORKS, IEEE Transactions on Neural Networks, 5(3), 1994, pp. 493-497
Citations number
4
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science Artificial Intelligence; Computer Science Hardware & Architecture; Computer Science Theory & Methods
ISSN journal
10459227
Volume
5
Issue
3
Year of publication
1994
Pages
493 - 497
Database
ISI
SICI code
1045-9227(1994)5:3<493:AALAFM>2.0.ZU;2-B
Abstract
An accelerated learning algorithm (ABP - adaptive back propagation) is proposed for the supervised training of multilayer perceptron networks. The learning algorithm is inspired by the principle of "forced dynamics" for the total error functional. The algorithm updates the weights in the direction of steepest descent, but with a learning rate that is a specific function of the error and of the error gradient norm. The specific form of this function is chosen so as to accelerate convergence. Furthermore, ABP introduces none of the additional "tuning" parameters found in variants of the backpropagation algorithm. Simulation results indicate a superior convergence speed for analog problems only, as compared to other competing methods, as well as reduced sensitivity to variations in the algorithm's step-size parameter.
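The record does not give the exact functional form of the adaptive learning rate, but the "forced dynamics" idea can be sketched as follows: if one demands that the error decay as dE/dt = -lambda*E, then under steepest descent dw/dt = -eta*grad(E) one has dE/dt = -eta*||grad(E)||^2, which suggests eta = lambda*E / ||grad(E)||^2. The snippet below is a minimal illustration of such an error- and gradient-norm-dependent step size on a toy quadratic error functional; the function name, the choice of E, and the constant `lam` are assumptions for illustration, not the paper's actual ABP formulation.

```python
import numpy as np

def forced_dynamics_descent(w0, lam=0.5, steps=50, eps=1e-12):
    """Minimize the toy error E(w) = 0.5*||w||^2 with an adaptive step.

    Hypothetical sketch: the learning rate eta = lam * E / ||grad E||^2
    is chosen so that the error obeys (approximately) dE/dt = -lam * E,
    i.e. exponential decay of the total error functional.
    """
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        error = 0.5 * np.dot(w, w)      # total error functional E(w)
        grad = w                        # gradient of E for this toy problem
        gnorm2 = np.dot(grad, grad)     # squared gradient norm ||grad E||^2
        if gnorm2 < eps:                # stop near a stationary point
            break
        eta = lam * error / gnorm2      # error-dependent learning rate
        w = w - eta * grad              # steepest-descent weight update
    return w

w_final = forced_dynamics_descent([2.0, -1.0])
```

For this quadratic, eta works out to the constant lam/2, so each step contracts the weights by a fixed factor; on a real multilayer perceptron the error and gradient norm vary across training, and the rate adapts accordingly.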