UNSUPERVISED LEARNING FOR BOLTZMANN MACHINES

Authors
Citation
G. Deco and L. Parra, UNSUPERVISED LEARNING FOR BOLTZMANN MACHINES, Network, 6(3), 1995, pp. 437-448
Citations number
17
Categorie Soggetti
Mathematical Methods, Biology & Medicine; Neurosciences; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence
Journal title
Network
ISSN journal
0954898X
Volume
6
Issue
3
Year of publication
1995
Pages
437 - 448
Database
ISI
SICI code
0954-898X(1995)6:3<437:ULFBM>2.0.ZU;2-L
Abstract
An unsupervised learning algorithm for a stochastic recurrent neural network based on the Boltzmann Machine architecture is formulated in this paper. The maximization of the mutual information between the stochastic output neurons and the clamped inputs is used as an unsupervised criterion for training the network. The resulting learning rule contains two terms corresponding to Hebbian and anti-Hebbian learning. It is interesting that these two terms are weighted by the amount of information transmitted in the learning synapse, giving an information-theoretic interpretation of the proportionality constant of Hebb's biological rule. The anti-Hebbian term, which can be interpreted as a forgetting function, supports the optimal coding. In this way, optimal nonlinear and recurrent implementations of data compression of Boolean patterns are obtained. As an example, the encoder problem is simulated and trained in an unsupervised way in a one-layer network. Compression of non-uniformly distributed binary data is included. Unsupervised classification, even for continuous inputs, is shown for the cases of four overlapping Gaussian spots and for a real-world example of thyroid diagnosis. In comparison with other techniques, the present model requires an exponentially smaller number of weights for the classification problem.
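The abstract describes a learning rule with a Hebbian term and an anti-Hebbian (forgetting) term, the two weighted by transmitted information. The paper's exact mutual-information rule is not reproduced here; as an illustrative neighbor only, the sketch below implements the classical contrastive Boltzmann Machine update, which shares that two-phase Hebbian/anti-Hebbian structure: correlations are collected with the inputs clamped (Hebbian phase) and with the network running freely (anti-Hebbian phase), and the weights move along their difference. The toy network size, pattern, and all function names are this sketch's own assumptions, not the authors' code.

```python
import math
import random

random.seed(42)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sample_states(W, b, clamped, n_units, n_samples=200, burn_in=10):
    """Gibbs-sample binary network states; `clamped` maps unit index -> fixed value."""
    state = [clamped.get(i, random.randint(0, 1)) for i in range(n_units)]
    samples = []
    for t in range(burn_in + n_samples):
        for i in range(n_units):
            if i in clamped:
                continue  # clamped (input) units stay fixed in this phase
            net = b[i] + sum(W[i][j] * state[j] for j in range(n_units) if j != i)
            state[i] = 1 if random.random() < sigmoid(net) else 0
        if t >= burn_in:
            samples.append(list(state))
    return samples

def mean_products(samples, n_units):
    """Estimate the pairwise correlations <s_i s_j> from sampled states."""
    C = [[0.0] * n_units for _ in range(n_units)]
    for s in samples:
        for i in range(n_units):
            for j in range(n_units):
                C[i][j] += s[i] * s[j]
    m = float(len(samples))
    return [[C[i][j] / m for j in range(n_units)] for i in range(n_units)]

# Toy setup (an assumption for illustration): 2 input units, 2 output units,
# symmetric weights with zero self-connections, as in a Boltzmann Machine.
n = 4
W = [[0.0 if i == j else random.uniform(-0.1, 0.1) for j in range(n)] for i in range(n)]
for i in range(n):
    for j in range(i):
        W[i][j] = W[j][i]  # enforce symmetry
b = [0.0] * n
pattern = {0: 1, 1: 0}  # one Boolean input pattern clamped on units 0 and 1

# One contrastive update: Hebbian (clamped) minus anti-Hebbian (free) correlations.
eta = 0.05
clamped_C = mean_products(sample_states(W, b, pattern, n), n)
free_C = mean_products(sample_states(W, b, {}, n), n)
for i in range(n):
    for j in range(n):
        if i != j:
            W[i][j] += eta * (clamped_C[i][j] - free_C[i][j])
```

Because the sampled correlations are symmetric in i and j, the update preserves the symmetry of W; the paper's rule additionally scales each term by the information carried by the synapse, which this sketch omits.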