INITIALIZING WEIGHTS OF A MULTILAYER PERCEPTRON NETWORK BY USING THE ORTHOGONAL LEAST-SQUARES ALGORITHM

Citation
M. Lehtokangas et al., INITIALIZING WEIGHTS OF A MULTILAYER PERCEPTRON NETWORK BY USING THE ORTHOGONAL LEAST-SQUARES ALGORITHM, Neural computation, 7(5), 1995, pp. 982-999
Citations number
16
Categorie Soggetti
"Computer Sciences","Computer Science Artificial Intelligence","Neurosciences"
Journal title
NEURAL COMPUTATION
ISSN journal
08997667
Volume
7
Issue
5
Year of publication
1995
Pages
982 - 999
Database
ISI
SICI code
0899-7667(1995)7:5<982:IWOAMP>2.0.ZU;2-M
Abstract
Usually the training of a multilayer perceptron network starts by initializing the network weights with small random values; the weight adjustment is then carried out by an iterative gradient-descent-based optimization routine called backpropagation training. If the random initial weights happen to be far from a good solution, or if they are near a poor local optimum, the training will take a long time, since many iteration steps are required. Furthermore, it is quite possible that the network will not converge to an adequate solution at all. On the other hand, if the initial weights are close to a good solution, the training will be much faster and the possibility of obtaining adequate convergence increases. In this paper a new method for initializing the weights is presented. The method is based on the orthogonal least squares algorithm. The simulation results obtained with the proposed initialization method show a considerable improvement in training compared to randomly initialized networks. In light of practical experiments, the proposed method has proven to be fast and useful for initializing the network weights.
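The abstract does not spell out the initialization procedure, but the core of any orthogonal least-squares (OLS) approach is greedy forward selection of regressors by their error-reduction ratio. The NumPy sketch below illustrates that principle only; it is not the paper's exact method, and the idea of treating randomly generated hidden units as the candidate regressors is an assumption made here for illustration.

```python
import numpy as np

def ols_select(candidates, y, n_select):
    """Greedy orthogonal least-squares (OLS) forward selection.

    candidates : (N, M) array whose columns are candidate regressors
                 (e.g. outputs of randomly generated hidden units)
    y          : (N,) target vector
    n_select   : number of columns to keep

    Returns the indices of the selected columns, in selection order.
    """
    P = np.asarray(candidates, dtype=float)
    y = np.asarray(y, dtype=float)
    yy = y @ y
    Q = []          # orthogonal basis of the columns chosen so far
    selected = []
    for _ in range(n_select):
        best_err, best_j, best_w = -1.0, -1, None
        for j in range(P.shape[1]):
            if j in selected:
                continue
            # Gram-Schmidt: orthogonalize column j against the current basis
            w = P[:, j].copy()
            for q in Q:
                w -= (q @ P[:, j]) / (q @ q) * q
            ww = w @ w
            if ww < 1e-12:       # numerically dependent column, skip it
                continue
            # error-reduction ratio: fraction of target energy explained
            err = (w @ y) ** 2 / (ww * yy)
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        Q.append(best_w)
        selected.append(best_j)
    return selected
```

Under this reading, a network would be initialized by generating many random candidate hidden units, evaluating their activations on the training inputs to form the candidate matrix, and keeping the units that OLS ranks highest as the initial hidden layer.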