N. B. Karayiannis, "Accelerating the training of feedforward neural networks using generalized Hebbian rules for initializing the internal representations," IEEE Transactions on Neural Networks, vol. 7, no. 2, pp. 419-426, 1996.
This paper presents an unsupervised learning scheme for initializing the internal representations of feedforward neural networks, which accelerates the convergence of supervised learning algorithms. It is proposed in this paper that the initial set of internal representations can be formed through a bottom-up unsupervised learning process applied before the top-down supervised training algorithm. The synaptic weights that connect the input of the network with the hidden units can be determined through linear or nonlinear variations of a generalized Hebbian learning rule, known as Oja's rule. Various generalized Hebbian rules were experimentally tested and evaluated in terms of their effect on the convergence of the supervised training process. Several experiments indicated that the use of the proposed initialization of the internal representations significantly improves the convergence of gradient-descent-based algorithms used to perform nontrivial training tasks. The improvement of the convergence becomes significant as the size and complexity of the training task increase.
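As a minimal sketch of the kind of rule the abstract refers to, the update below implements the standard linear form of Oja's rule, w ← w + η·y·(x − y·w) with y = wᵀx, on synthetic data. The data, learning rate, and iteration count are illustrative assumptions, not taken from the paper; the paper's initialization scheme applies such rules (and nonlinear variants) to set the input-to-hidden weights before supervised training begins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inputs whose variance is largest along the first axis.
X = rng.normal(size=(2000, 3)) * np.array([3.0, 1.0, 0.5])

# Random unit-norm initial weight vector for one linear hidden unit.
w = rng.normal(size=3)
w /= np.linalg.norm(w)
eta = 0.005  # small learning rate for stability

for x in X:
    y = w @ x                   # linear unit output y = w . x
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term with implicit weight decay

# Oja's rule drives w toward a unit vector along the dominant
# principal direction of the inputs.
print(np.linalg.norm(w))  # close to 1.0
print(w)                  # first component dominates
```

The normalizing term −η·y²·w is what distinguishes Oja's rule from plain Hebbian learning: it keeps the weight norm bounded, so the weights converge to the principal component of the input distribution rather than growing without limit.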