INFORMATION-PROCESSING BY A PERCEPTRON IN AN UNSUPERVISED LEARNING-TASK

Authors
Nadal J-P, Parga N
Citation
J.-P. Nadal and N. Parga, INFORMATION-PROCESSING BY A PERCEPTRON IN AN UNSUPERVISED LEARNING-TASK, Network, 4(3), 1993, pp. 295-312
Citations number
27
Subject categories
Mathematical Methods, Biology & Medicine; Neurosciences; Engineering, Electrical & Electronic; Mathematics; Computer Applications & Cybernetics
Journal title
Network
ISSN journal
0954898X
Volume
4
Issue
3
Year of publication
1993
Pages
295 - 312
Database
ISI
SICI code
0954-898X(1993)4:3<295:IBAPIA>2.0.ZU;2-0
Abstract
We study the ability of a simple neural network (a perceptron architecture, no hidden units, binary outputs) to process information in the context of an unsupervised learning task. The network is asked to provide the best possible neural representation of a given input distribution, according to some criterion taken from information theory. We compare various optimization criteria that have been proposed: maximum information transmission, minimum redundancy and closeness to a factorial code. We show that for the perceptron one can compute the maximum information that the code (the output neural representation) can convey about the input. We show that one can use statistical mechanics techniques, such as replica techniques, to compute the typical mutual information between input and output distributions. More precisely, for a Gaussian input source with a given correlation matrix, we compute the typical mutual information when the couplings are chosen randomly. We determine the correlations between the synaptic couplings that maximize the gain of information. We analyse the results in the case of a one-dimensional receptive field.
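As a toy illustration of the quantity the abstract discusses (this is an assumption-laden sketch, not the paper's calculation): for a *linear* Gaussian channel the mutual information between a correlated Gaussian input and the output is available in closed form, I(X;Y) = ½ log det(I + J C Jᵀ/σ²) nats, where C is the input correlation matrix, J the coupling matrix and σ² the output noise variance. The paper treats the harder binary-output (sign) case via replica methods; the sketch below only shows this Gaussian baseline, with an exponentially correlated one-dimensional input as a stand-in for a 1D receptive field.

```python
import numpy as np

def gaussian_mutual_information(J, C, sigma2=1.0):
    """Mutual information (nats) of y = J x + noise, with x ~ N(0, C)
    and independent Gaussian output noise of variance sigma2.
    Closed form: I = 0.5 * log det(I + J C J^T / sigma2)."""
    n_out = J.shape[0]
    M = np.eye(n_out) + J @ C @ J.T / sigma2
    _, logdet = np.linalg.slogdet(M)  # numerically stable log-determinant
    return 0.5 * logdet

# Example: 8 input units with exponentially decaying correlations
# C_ij = rho^|i-j| (a simple 1D "receptive field" geometry).
n = 8
rho = 0.5
C = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Two output units with randomly chosen couplings, as in the abstract's
# "couplings chosen randomly" setting (here Gaussian, an assumption).
rng = np.random.default_rng(0)
J = rng.standard_normal((2, n))
print(gaussian_mutual_information(J, C))
```

One could then compare coupling ensembles (e.g. independent versus anticorrelated rows of J) by the information they convey; the paper's point is that for binary outputs this comparison requires replica techniques rather than a determinant formula.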