LEARNING ALGORITHMS BASED ON LINEARIZATION

Authors
Citation
R. Hahnloser, LEARNING ALGORITHMS BASED ON LINEARIZATION, Network, 9(3), 1998, pp. 363-380
Citations number
30
Categorie Soggetti
Computer Science, Artificial Intelligence; Neurosciences; Engineering, Electrical & Electronic
Journal title
Network
ISSN journal
0954-898X
Volume
9
Issue
3
Year of publication
1998
Pages
363 - 380
Database
ISI
SICI code
0954-898X(1998)9:3<363:LABOL>2.0.ZU;2-I
Abstract
The aim of this article is to investigate a mechanical description of learning. A framework for local and simple learning algorithms based on interpreting a neural network as a set of configuration constraints is proposed. For any architectural design and learning task, unsupervised and supervised algorithms can be derived, optionally using unconstrained and hidden neurons. Unlike algorithms based on the gradient in weight space, the proposed tangential correlation (TC) algorithms move along the gradient in state space. This results in optimal scaling properties and simple expressions for the weight updates. The number of synapses is much larger than the number of neurons. A constraint for neural states does not impose a unique constraint for synaptic weights. Which weights to assign credit to can be selected from a parametrization of all weight changes equivalently satisfying the state constraints. At the heart of the parametrization are minimal weight changes. Two supervised algorithms (differing by their parametrizations) operating on a three-layer perceptron are compared with standard backpropagation. The successful training of fixed points of recurrent networks is demonstrated. The unsupervised learning of oscillations with variable frequencies is performed on standard and more sophisticated recurrent networks. The results presented here can be useful both for the analysis and for the synthesis of learning algorithms.
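The abstract's key observation — that a constraint on neural states does not pin down a unique weight change, and that minimal weight changes anchor the parametrization of all equivalent updates — can be illustrated with a toy example. The sketch below is not the paper's TC algorithm; it is a generic minimal-norm construction for a single linear neuron, with all function and variable names invented for illustration.

```python
import numpy as np

def minimal_weight_update(w, x, y_target):
    """Smallest-norm delta_w such that (w + delta_w) . x == y_target.

    For a linear neuron y = w . x, the state constraint y = y_target is one
    equation in many unknowns, so infinitely many weight changes satisfy it.
    The minimal-norm solution lies along x: delta_w = (y_target - y) x / |x|^2.
    Adding any vector orthogonal to x gives another valid (but longer) update.
    """
    y = w @ x
    return (y_target - y) * x / (x @ x)

rng = np.random.default_rng(0)
w = rng.normal(size=5)
x = rng.normal(size=5)

dw = minimal_weight_update(w, x, y_target=1.0)
assert np.isclose((w + dw) @ x, 1.0)  # state constraint satisfied

# Parametrize an alternative solution: shift dw along a direction in the
# null space of the constraint (any vector orthogonal to x).
null = np.array([x[1], -x[0], 0.0, 0.0, 0.0])
assert np.isclose(null @ x, 0.0)
dw_alt = dw + 0.5 * null
assert np.isclose((w + dw_alt) @ x, 1.0)  # still satisfies the constraint
# ... but it is never shorter than the minimal update:
assert np.linalg.norm(dw_alt) >= np.linalg.norm(dw)
```

The family `dw + span(null space of x)` is the toy analogue of the abstract's "parametrization of all weight changes equivalently satisfying the state constraints"; choosing which component to add is a credit-assignment choice, and the minimal-norm member is the natural reference point.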