ADVANCED NEURAL-NETWORK TRAINING ALGORITHM WITH REDUCED COMPLEXITY BASED ON JACOBIAN DEFICIENCY

Authors
G. Zhou, J. Si
Citation
G. Zhou and J. Si, ADVANCED NEURAL-NETWORK TRAINING ALGORITHM WITH REDUCED COMPLEXITY BASED ON JACOBIAN DEFICIENCY, IEEE Transactions on Neural Networks, 9(3), 1998, pp. 448-453
Citations number
11
Subject Categories
Computer Science Artificial Intelligence; Computer Science Hardware & Architecture; Computer Science Theory & Methods; Engineering, Electrical & Electronic
ISSN journal
1045-9227
Volume
9
Issue
3
Year of publication
1998
Pages
448 - 453
Database
ISI
SICI code
1045-9227(1998)9:3<448:ANTAWR>2.0.ZU;2-8
Abstract
In this paper we introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, a modified Gauss-Newton method, has been used successfully to solve nonlinear least squares problems, including neural-network training. It significantly outperforms basic backpropagation and its variable-learning-rate variants in terms of training accuracy, convergence properties, and overall training time, but at the cost of higher computation and memory complexity within each iteration. The new method developed in this paper aims to improve convergence properties while reducing the memory and computation complexity of supervised neural-network training. Extensive simulation results are provided to demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.
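For context, a minimal sketch of the standard Levenberg-Marquardt step referenced in the abstract (not the paper's reduced-complexity algorithm, whose details appear only in the full text). The notation is the usual one and is assumed here: w is the weight vector of p network weights, e(w) the vector of training errors, and J = \partial e / \partial w the Jacobian. Each iteration solves a damped Gauss-Newton system:

\[
(J^{\mathsf{T}} J + \mu I)\,\Delta w = -\,J^{\mathsf{T}} e, \qquad w \leftarrow w + \Delta w,
\]

where \mu > 0 is the damping parameter; \mu \to 0 recovers the Gauss-Newton step, while large \mu approaches a small gradient-descent step. Forming J^{\mathsf{T}} J takes O(p^2) memory and solving the linear system O(p^3) operations per iteration, which is the per-iteration computation and memory overhead the abstract contrasts against backpropagation and which the proposed Jacobian-deficiency-based method is designed to reduce.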