Comments on "An Accelerated Learning Algorithm for Multilayer Perceptrons - Optimization Layer-by-Layer"

Citation
B. Ph. van Milligen et al., "Comments on an accelerated learning algorithm for multilayer perceptrons - optimization layer-by-layer," IEEE Transactions on Neural Networks, 9(2), 1998, pp. 339-341
Citations number
6
Subject Categories
Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
9
Issue
2
Year of publication
1998
Pages
339 - 341
Database
ISI
SICI code
1045-9227(1998)9:2<339:COAALA>2.0.ZU;2-E
Abstract
This letter analyzes the performance of the neural network training method known as optimization layer by layer (OLL) [1]. We show, from theoretical considerations, that the amount of work required with OLL learning scales as the third power of the network size, compared with the square of the network size for commonly used conjugate gradient (CG) training algorithms. This theoretical estimate is confirmed through a practical example. Thus, although OLL is shown to function very well for small neural networks (fewer than about 500 weights per layer), it is slower than CG for large neural networks. Second, we show that OLL does not always improve on the accuracy that can be obtained with CG. It seems that the final accuracy that can be obtained depends strongly on the initial network weights.
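
As a rough worked illustration of the scaling claim in the abstract, the sketch below tabulates the ratio of the two stated cost laws (cubic in network size for OLL, quadratic for CG) at a few sizes. The cost functions, the unit proportionality constants, and the sample sizes are illustrative assumptions for this comparison, not quantities taken from the letter.

    # Illustrative comparison of the cost laws quoted in the abstract:
    # OLL work ~ (network size)^3, CG work ~ (network size)^2.
    # Proportionality constants are set to 1; the sizes are arbitrary examples.

    def oll_work(size: int) -> int:
        """Work attributed to OLL training (cubic in network size)."""
        return size ** 3

    def cg_work(size: int) -> int:
        """Work attributed to CG training (quadratic in network size)."""
        return size ** 2

    if __name__ == "__main__":
        for size in (50, 100, 500, 1000, 5000):
            ratio = oll_work(size) / cg_work(size)
            print(f"network size {size:5d}: OLL/CG work ratio ~ {ratio:,.0f}")

Under these assumed cost laws the ratio grows linearly with network size, which is consistent with the abstract's conclusion that OLL is attractive only for small networks and becomes slower than CG as the network grows.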