EFFECTIVE BACKPROPAGATION TRAINING WITH VARIABLE STEPSIZE

Citation
G.D. Magoulas et al., EFFECTIVE BACKPROPAGATION TRAINING WITH VARIABLE STEPSIZE, Neural Networks, 10(1), 1997, pp. 69-82
Citations number
34
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
10
Issue
1
Year of publication
1997
Pages
69 - 82
Database
ISI
SICI code
0893-6080(1997)10:1<69:EBTWVS>2.0.ZU;2-H
Abstract
The issue of variable stepsize in the backpropagation training algorithm has been widely investigated and several techniques employing heuristic factors have been suggested to improve training time and reduce convergence to local minima. In this contribution, backpropagation training is based on a modified steepest descent method which allows variable stepsize. It is computationally efficient and possesses interesting convergence properties, utilizing estimates of the Lipschitz constant without any additional computational cost. The algorithm has been implemented and tested on several problems and the results have been very satisfactory. Numerical evidence shows that the method is robust with good average performance on many classes of problems. Copyright (C) 1996 Elsevier Science Ltd.
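The abstract describes adapting the steepest-descent stepsize from estimates of the Lipschitz constant of the gradient, obtained at no extra computational cost. The paper's exact update rule is not reproduced here; the sketch below illustrates one common form of this idea, estimating a local Lipschitz constant Lambda_k as the ratio of the gradient change to the weight change between consecutive iterates and setting the stepsize to 1/(2*Lambda_k). All function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def lipschitz_stepsize_descent(grad, w0, iters=200, lam0=1.0, eps=1e-12):
    """Steepest descent whose stepsize is set from a running estimate of
    the local Lipschitz constant of the gradient (an illustrative sketch
    of the idea in the abstract, not the paper's exact algorithm)."""
    w = np.asarray(w0, dtype=float)
    g = grad(w)
    step = 1.0 / (2.0 * lam0)            # initial stepsize from a guess lam0
    for _ in range(iters):
        w_new = w - step * g             # steepest-descent update
        g_new = grad(w_new)
        dw = np.linalg.norm(w_new - w)
        if dw > eps:
            # local Lipschitz estimate: ||gradient change|| / ||weight change||
            lam = np.linalg.norm(g_new - g) / dw
            if lam > eps:
                step = 1.0 / (2.0 * lam) # conservative stepsize 1/(2*Lambda_k)
        w, g = w_new, g_new
    return w

# Usage on a simple quadratic E(w) = 0.5 * w @ A @ w, whose gradient is A @ w;
# the minimizer is w = 0, which the iteration approaches.
A = np.diag([1.0, 2.0])
w_star = lipschitz_stepsize_descent(lambda w: A @ w, np.array([3.0, -2.0]))
```

The estimate requires only the gradient and weight vectors from the previous iteration, which a backpropagation implementation already computes, so no additional gradient evaluations are needed per step.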