M. Dimartino et al., Exploring and Comparing the Best Direct Methods for the Efficient Training of MLP-Networks, IEEE Transactions on Neural Networks, 7(6), 1996, pp. 1497-1502
It is well known that the main difficulties of backpropagation-based algorithms are their susceptibility to local minima and their slow adaptation to the training patterns. In this paper, we present a class of algorithms that overcome these difficulties by utilizing "direct" numerical methods to compute the weight matrices. In particular, we investigate the performance of the algorithms FBFBK-LSB (the first part named for the authors' initials and the second meaning least-squares backpropagation) and iterative conjugate gradient singular-value decomposition (ICGSVD), introduced by Bärmann and Biegler-König and by the authors, respectively. Numerical results on several benchmark problems show the superior reliability and/or efficiency of our algorithm ICGSVD.