In this paper we present a new algorithm for training feed-forward neural networks that is orders of magnitude faster than the delta rule. It improves substantially on the method of Scalero and Tepedelenlioglu (IEEE Trans. Signal Process. 40(1) (1992)) in both training time and numerical stability: it combines their modified back-propagation algorithm with a faster training scheme that also exhibits better numerical stability. The algorithm is tested against other methods, and results are presented. (C) 1997 Pattern Recognition Society.
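For context, the delta rule cited as the baseline above is the classical gradient-descent update for a single linear unit, w ← w + η(t − y)x. The following is a minimal illustrative sketch of that baseline only (not the paper's algorithm); the data, learning rate, and epoch count are assumptions chosen for demonstration.

```python
import numpy as np

# Sketch of the delta-rule baseline: a single linear unit trained by
# stochastic gradient descent on squared error over a noiseless linear task.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # 100 samples, 3 inputs (illustrative)
true_w = np.array([1.5, -2.0, 0.5])  # known weights generating the targets
t = X @ true_w                       # noiseless linear targets

w = np.zeros(3)
eta = 0.01                           # learning rate (assumed)
for epoch in range(200):
    for x, target in zip(X, t):
        y = w @ x                    # unit output
        w += eta * (target - y) * x  # delta rule: w <- w + eta * (t - y) * x

# On this noiseless problem the delta rule recovers true_w, but only after
# many epochs of per-sample updates, which motivates faster schemes.
print(np.allclose(w, true_w, atol=1e-3))
```

The sketch recovers the generating weights, but its slow per-sample convergence illustrates why the abstract's claimed speedup over the delta rule is significant.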