HYBRID LEARNING SCHEMES FOR FAST TRAINING OF FEEDFORWARD NEURAL NETWORKS

Authors
N.B. Karayiannis
Citation
N.B. Karayiannis, "Hybrid learning schemes for fast training of feedforward neural networks", Mathematics and Computers in Simulation, 41(1-2), 1996, pp. 13-28
Citations number
18
Categorie Soggetti
"Computer Sciences","Mathematics","Computer Science Interdisciplinary Applications","Computer Science Software Graphics Programming"
ISSN journal
0378-4754
Volume
41
Issue
1-2
Year of publication
1996
Pages
13 - 28
Database
ISI
SICI code
0378-4754(1996)41:1-2<13:HLSFFT>2.0.ZU;2-1
Abstract
Fast training of feed-forward neural networks has become increasingly important as the neural network field moves toward maturity. This paper begins with a review of various criteria proposed for training feed-forward neural networks, which include the frequently used quadratic error criterion, the relative entropy criterion, and a generalized training criterion. The minimization of these criteria using the gradient descent method results in a variety of supervised learning algorithms. The performance of these algorithms in complex training tasks is strongly affected by the initial set of internal representations, which are usually formed by a randomly generated set of synaptic weights. The convergence of gradient descent based learning algorithms in complex training tasks can be significantly improved by initializing the internal representations using an unsupervised learning process based on linear or nonlinear generalized Hebbian learning rules. The efficiency of the hybrid learning scheme presented in this paper is illustrated through experimental results.
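The two-phase hybrid scheme the abstract describes can be sketched in code. The sketch below is an illustrative reconstruction, not the paper's exact algorithm: the hidden-layer weights are first formed by an unsupervised pass of Sanger's generalized Hebbian rule (one linear instance of the generalized Hebbian learning the abstract mentions), and the network is then trained by gradient descent on the quadratic error criterion. All function names, learning rates, epoch counts, and the toy AND task are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def gha_init(X, n_hidden, lr=0.01, epochs=50):
    """Unsupervised initialization via Sanger's generalized Hebbian rule.

    The rows of W converge toward the leading principal components of the
    (centered) inputs, giving structured rather than random internal
    representations.
    """
    Xc = X - X.mean(axis=0)                 # the rule assumes zero-mean inputs
    W = rng.normal(scale=0.1, size=(n_hidden, X.shape[1]))
    for _ in range(epochs):
        for x in Xc:
            y = W @ x
            # Sanger's rule: dW_ij = lr * y_i * (x_j - sum_{k<=i} y_k W_kj)
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, T, n_hidden, lr=1.0, epochs=5000, W1=None):
    """One-hidden-layer sigmoid network trained by gradient descent
    on the quadratic error criterion; W1 may come from gha_init."""
    n = len(X)
    if W1 is None:                          # baseline: random initialization
        W1 = rng.normal(scale=0.1, size=(n_hidden, X.shape[1]))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(T.shape[1], n_hidden))
    b2 = np.zeros(T.shape[1])
    for _ in range(epochs):
        H = sigmoid(X @ W1.T + b1)          # hidden activations
        Y = sigmoid(H @ W2.T + b2)          # network outputs
        dY = (Y - T) * Y * (1 - Y)          # quadratic-error delta at output
        dH = (dY @ W2) * H * (1 - H)        # backpropagated delta
        W2 -= lr * dY.T @ H / n
        b2 -= lr * dY.mean(axis=0)
        W1 -= lr * dH.T @ X / n
        b1 -= lr * dH.mean(axis=0)
    return np.mean((Y - T) ** 2)

# Toy demonstration: Hebbian-initialized network learning the AND mapping.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [0], [0], [1]], dtype=float)
W1 = gha_init(X, n_hidden=2)                # unsupervised phase
err = train(X, T, n_hidden=2, W1=W1)        # supervised phase
```

The unsupervised phase is cheap relative to the supervised phase, which is the appeal of the approach: the gradient descent stage starts from representations aligned with the dominant directions of the input distribution instead of arbitrary random weights.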