EXPLORING AND COMPARING THE BEST DIRECT-METHODS FOR THE EFFICIENT TRAINING OF MLP-NETWORKS

Citation
M. Dimartino et al., EXPLORING AND COMPARING THE BEST DIRECT-METHODS FOR THE EFFICIENT TRAINING OF MLP-NETWORKS, IEEE Transactions on Neural Networks, 7(6), 1996, pp. 1497-1502
Citations number
13
Categorie Soggetti
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
10459227
Volume
7
Issue
6
Year of publication
1996
Pages
1497 - 1502
Database
ISI
SICI code
1045-9227(1996)7:6<1497:EACTBD>2.0.ZU;2-4
Abstract
It is well known that the main difficulties of algorithms based on backpropagation are their susceptibility to local minima and their slow adaptation to the patterns during training. In this paper, we present a class of algorithms that overcome these difficulties by utilizing some ''direct'' numerical methods for the computation of the weight matrices. In particular, we investigate the performance of the algorithms FBFBK-LSB (the first part named for the authors' initials and the second meaning least-squares backpropagation) and iterative conjugate gradient singular-value decomposition (ICGSVD), introduced by Bärmann and Biegler-König and by the authors, respectively. Numerical results on several benchmark problems show the greater reliability and/or efficiency of our algorithm ICGSVD.
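The core linear step shared by such ''direct'' methods can be sketched as follows. This is a minimal illustration, not the authors' FBFBK-LSB or ICGSVD algorithm: it assumes a single layer whose activations `H` are already known, and computes the weight matrix as the least-squares solution of a linear system via the SVD pseudoinverse (the names `solve_layer_weights`, `H`, `T` are hypothetical).

```python
import numpy as np

def solve_layer_weights(H, T, rcond=1e-10):
    """Solve H @ W ~= T for W in the least-squares sense via SVD.

    Illustrates the idea behind direct training methods: a layer's
    weight matrix is obtained by solving a linear system, rather than
    by iterative gradient descent.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    # Truncate tiny singular values for numerical stability.
    s_inv = np.where(s > rcond * s.max(), 1.0 / s, 0.0)
    # W = V @ diag(s_inv) @ U.T @ T  (the pseudoinverse applied to T)
    return Vt.T @ (s_inv[:, None] * (U.T @ T))

# Toy usage: recover a known weight matrix from activations and targets.
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 10))      # 50 patterns, 10 hidden units
W_true = rng.standard_normal((10, 3))  # 3 output units
T = H @ W_true
W = solve_layer_weights(H, T)
print(np.allclose(W, W_true, atol=1e-8))
```

In a full direct method, this linear solve is applied layer by layer after propagating target values backwards, which is where the ''least-squares backpropagation'' name comes from.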