Leap-frog is a robust algorithm for training neural networks

Citation
J.E.W. Holm and E.C. Botha, Leap-frog is a robust algorithm for training neural networks, NETWORK-COM, 10(1), 1999, pp. 1-13
Number of citations
20
Subject categories
AI Robotics and Automatic Control
Journal title
NETWORK-COMPUTATION IN NEURAL SYSTEMS
ISSN journal
0954-898X
Volume
10
Issue
1
Year of publication
1999
Pages
1 - 13
Database
ISI
SICI code
0954-898X(199902)10:1<1:LIARAF>2.0.ZU;2-5
Abstract
Optimization of perceptron neural network classifiers requires an optimization algorithm that is robust. In general, the best network is selected after a number of optimization trials. An effective optimization algorithm generates good weight-vector solutions in a few optimization trial runs owing to its inherent ability to escape local minima, where a less effective algorithm requires a larger number of trial runs. Repetitive training and testing is a tedious process, so that an effective algorithm is desirable to reduce training time and increase the quality of the set of available weight-vector solutions. We present leap-frog as a robust optimization algorithm for training neural networks. In this paper the dynamic principles of leap-frog are described together with experiments to show the ability of leap-frog to generate reliable weight-vector solutions. Performance histograms are used to compare leap-frog with a variable-metric method, a conjugate-gradient method with modified restarts, and a constrained-momentum-based algorithm. Results indicate that leap-frog performs better in terms of classification error than the remaining three algorithms on two distinctly different test problems.
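The abstract describes leap-frog's dynamic principles only at a high level. As a rough illustration of the idea, the Python sketch below treats the weight vector as a particle driven by the negative error gradient, with a simple speed-damping rule so the trajectory settles into a minimum rather than oscillating past it. The function name `leapfrog_minimize`, the step size `dt`, and the damping factors are assumptions made for illustration; they do not reproduce the interference rules of the published algorithm.

```python
import numpy as np

def leapfrog_minimize(grad, x0, dt=0.1, max_iter=10000, tol=1e-6):
    """Minimise a differentiable function from its gradient using a
    leap-frog style dynamic-trajectory scheme: the parameter vector is
    treated as a particle accelerated by the negative gradient, and the
    motion is damped whenever the speed stops increasing, so the particle
    settles into a minimum instead of oscillating past it."""
    x = np.asarray(x0, dtype=float)
    v = -0.5 * dt * grad(x)              # initial half-step velocity
    for _ in range(max_iter):
        a = -grad(x)                     # acceleration from the negative gradient
        if np.linalg.norm(a) < tol:      # gradient small enough: stop
            break
        v_new = v + dt * a               # leap-frog velocity update
        x_new = x + dt * v_new           # leap-frog position update
        if np.linalg.norm(v_new) < np.linalg.norm(v):
            # Speed is falling, i.e. the particle is moving uphill: damp the
            # motion (illustrative heuristic, not the paper's exact rules).
            v_new = 0.25 * (v + v_new)
            x_new = 0.5 * (x + x_new)
        x, v = x_new, v_new
    return x

# Toy usage: minimise f(w) = (w0 - 3)^2 + (w1 + 1)^2, whose minimum is at (3, -1).
toy_grad = lambda w: np.array([2.0 * (w[0] - 3.0), 2.0 * (w[1] + 1.0)])
print(leapfrog_minimize(toy_grad, [0.0, 0.0]))
```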