The backpropagation (BP) algorithm for training feedforward neural networks has proven robust even for difficult problems. However, its high-performance results are attained at the expense of a long training time to adjust the network parameters, which can be discouraging in many real-world applications. Even on relatively simple problems, standard BP often requires a lengthy training process in which the complete set of training examples is processed hundreds or thousands of times. In this paper, a universal acceleration technique for the BP algorithm is presented, based on extrapolation of each individual interconnection weight. This extrapolation procedure is easy to implement and is activated only a few times between iterations of the conventional BP algorithm. Unlike earlier acceleration procedures, it minimally alters the computational structure of the BP algorithm. The viability of this new approach is demonstrated on three examples. The results suggest that it yields significant savings in computation time over the standard BP algorithm. Moreover, the solution computed by the proposed approach always lies in close proximity to the one obtained by the conventional BP procedure. Hence, the proposed method provides a real acceleration of the BP algorithm without degrading the usefulness of its solutions. The performance of the new method is also compared with that of the conjugate gradient algorithm, which is an improved and faster version of the BP algorithm. (C) 1999 Elsevier Science Ltd. All rights reserved.
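The abstract does not give the exact per-weight extrapolation formula or schedule, so the following is only a minimal sketch of the general idea: run conventional gradient-descent BP epochs and, once in a while, push each weight further along its recent update direction. The linear extrapolation rule, the factor `gamma`, the `extrapolate_every` schedule, and the toy quadratic loss are all illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def extrapolate(w_prev, w_curr, gamma=2.0):
    # Per-weight linear extrapolation along the most recent update direction.
    # The linear rule and gamma=2.0 are assumptions made for illustration.
    return w_curr + gamma * (w_curr - w_prev)

def train(w, grad_fn, lr=0.005, epochs=400, extrapolate_every=50):
    """Plain gradient-descent BP with an occasional extrapolation step interleaved."""
    for t in range(1, epochs + 1):
        w_new = w - lr * grad_fn(w)         # one conventional BP epoch
        if t % extrapolate_every == 0:
            w_new = extrapolate(w, w_new)   # infrequent acceleration step
        w = w_new
    return w

if __name__ == "__main__":
    # Toy quadratic "loss" 0.5 * ||A w - b||^2, used only to exercise the training loop.
    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
    grad = lambda w: A.T @ (A @ w - b)
    w_star = train(np.zeros(5), grad)
    print("final loss:", 0.5 * np.linalg.norm(A @ w_star - b) ** 2)
```

Because the extrapolation is applied only every few epochs and touches nothing but the weight values themselves, the surrounding BP computation is left unchanged, which is the property the abstract emphasizes.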