N.B. Karayiannis, Hybrid learning schemes for fast training of feedforward neural networks, Mathematics and Computers in Simulation, 41(1-2), 1996, pp. 13-28
Fast training of feed-forward neural networks has become increasingly important as the neural network field moves toward maturity. This paper begins with a review of various criteria proposed for training feed-forward neural networks, including the frequently used quadratic error criterion, the relative entropy criterion, and a generalized training criterion. The minimization of these criteria using the gradient descent method results in a variety of supervised learning algorithms.
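For reference, the quadratic error and relative entropy criteria take the following conventional forms (the notation here is illustrative, not the paper's own), where \(y_{ki}\) is the target and \(\hat{y}_{ki}\) the network output of unit \(i\) on training example \(k\):

\[
E_{\mathrm{Q}} = \frac{1}{2}\sum_{k}\sum_{i}\left(y_{ki}-\hat{y}_{ki}\right)^{2},
\qquad
E_{\mathrm{RE}} = \sum_{k}\sum_{i}\left[\,y_{ki}\ln\frac{y_{ki}}{\hat{y}_{ki}} + \left(1-y_{ki}\right)\ln\frac{1-y_{ki}}{1-\hat{y}_{ki}}\right],
\]

with the relative entropy form assuming targets and outputs in (0, 1). Gradient descent then updates each synaptic weight as \(w \leftarrow w - \eta\,\partial E/\partial w\) for a learning rate \(\eta\).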
The performance of these algorithms in complex training tasks is strongly affected by the initial set of internal representations, which are usually formed by a randomly generated set of synaptic weights. The convergence of gradient descent based learning algorithms in complex training tasks can be significantly improved by initializing the internal representations using an unsupervised learning process based on linear or nonlinear generalized Hebbian learning rules.
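As a concrete sketch of this initialization step, the linear generalized Hebbian (Sanger) rule can be written in a few lines of NumPy; the function name, learning rate, and training loop below are illustrative assumptions, not the paper's exact procedure:

    import numpy as np

    def gha_pretrain(X, n_hidden, lr=0.01, epochs=50, seed=0):
        """Linear generalized Hebbian (Sanger) rule: the rows of W
        converge toward the leading principal components of X."""
        rng = np.random.default_rng(seed)
        W = 0.1 * rng.standard_normal((n_hidden, X.shape[1]))
        for _ in range(epochs):
            for x in X:
                y = W @ x                 # hidden-unit activations
                # Sanger update: dW = lr * (y x^T - LT[y y^T] W),
                # where LT[.] keeps the lower triangle (incl. diagonal)
                W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
        return W

    # Toy usage: pre-train on zero-mean inputs, then use W0 to seed the
    # input-to-hidden weights of the network before gradient descent.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 8))
    X -= X.mean(axis=0)                   # GHA assumes zero-mean inputs
    W0 = gha_pretrain(X, n_hidden=4)

Seeding the hidden-layer weights with W0 rather than random values gives the subsequent gradient descent phase internal representations that are already aligned with the principal structure of the input data, which is the intuition behind such hybrid schemes.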
The efficiency of the hybrid learning scheme presented in this paper is illustrated through experimental results.