Accelerating neural network training using weight extrapolations

Citation
Sv. Kamarthi and S. Pittner, Accelerating neural network training using weight extrapolations, NEURAL NETW, 12(9), 1999, pp. 1285-1299
Number of citations
50
Subject Categories
AI Robotics and Automatic Control
Journal title
NEURAL NETWORKS
ISSN journal
0893-6080
Volume
12
Issue
9
Year of publication
1999
Pages
1285 - 1299
Database
ISI
SICI code
0893-6080(199911)12:9<1285:ANNTUW>2.0.ZU;2-K
Abstract
The backpropagation (BP) algorithm for training feedforward neural networks has proven robust even for difficult problems. However, its high performance results are attained at the expense of a long training time to adjust the network parameters, which can be discouraging in many real-world applications. Even on relatively simple problems, standard BP often requires a lengthy training process in which the complete set of training examples is processed hundreds or thousands of times. In this paper, a universal acceleration technique for the BP algorithm based on extrapolation of each individual interconnection weight is presented. This extrapolation procedure is easy to implement and is activated only a few times in between iterations of the conventional BP algorithm. This procedure, unlike earlier acceleration procedures, minimally alters the computational structure of the BP algorithm. The viability of this new approach is demonstrated on three examples. The results suggest that it leads to significant savings in computation time of the standard BP algorithm. Moreover, the solution computed by the proposed approach is always located in close proximity to the one obtained by the conventional BP procedure. Hence, the proposed method provides a real acceleration of the BP algorithm without degrading the usefulness of its solutions. The performance of the new method is also compared with that of the conjugate gradient algorithm, which is an improved and faster version of the BP algorithm. (C) 1999 Elsevier Science Ltd. All rights reserved.
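
To make the idea concrete, the following minimal Python/NumPy sketch interleaves a per-weight extrapolation step with ordinary batch BP on the XOR problem. The abstract does not state the paper's extrapolation formula, so the sketch assumes each weight's successive changes decay roughly geometrically and jumps each weight toward the limit of that geometric series (w + d*r/(1-r), with r the ratio of the last two changes). All names, the jump schedule (every 200 epochs), and the clipping constants are illustrative assumptions, not the authors' method.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def bp_step(W1, b1, W2, b2, lr=1.0):
    # One epoch of plain batch backpropagation on the squared error.
    h, out = forward(W1, b1, W2, b2)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 = W2 - lr * h.T @ d_out
    b2 = b2 - lr * d_out.sum(axis=0)
    W1 = W1 - lr * X.T @ d_h
    b1 = b1 - lr * d_h.sum(axis=0)
    return W1, b1, W2, b2

def extrapolate(prev2, prev1, curr, cap=10.0):
    # ASSUMED rule: treat each weight's successive changes as a geometric
    # series with ratio r and jump toward the projected limit of that series.
    d1, d2 = prev1 - prev2, curr - prev1
    r = np.divide(d2, d1, out=np.zeros_like(d2), where=np.abs(d1) > 1e-12)
    r = np.clip(r, -0.99, 0.99)          # skip diverging/oscillating weights
    jump = np.clip(d2 * r / (1.0 - r), -cap, cap)
    return curr + jump

params = [rng.normal(0, 1, (2, 3)), np.zeros(3),
          rng.normal(0, 1, (3, 1)), np.zeros(1)]
history = []
for epoch in range(1, 3001):
    params = list(bp_step(*params))
    history = (history + [params])[-3:]   # keep the last three weight snapshots
    # Activate the extrapolation only occasionally, leaving BP's structure intact.
    if epoch % 200 == 0 and len(history) == 3:
        params = [extrapolate(a, b, c) for a, b, c in zip(*history)]

print(np.round(forward(*params)[1].ravel(), 3))  # should approach [0, 1, 1, 0]

As in the paper's description, the extrapolation leaves the BP computation itself untouched: it only occasionally replaces the current weights with a projected point along each weight's own trajectory, after which plain BP resumes.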