An unsupervised learning algorithm for a stochastic recurrent neural network based on the Boltzmann Machine architecture is formulated in this paper. The maximization of the mutual information between the stochastic output neurons and the clamped inputs serves as the unsupervised criterion for training the network. The resulting learning rule contains two terms corresponding to Hebbian and anti-Hebbian learning. Interestingly, these two terms are weighted by the amount of information transmitted through the learning synapse, giving an information-theoretic interpretation of the proportionality constant of Hebb's biological rule. The anti-Hebbian term, which can be interpreted as a forgetting function, supports optimal coding. In this way, optimal nonlinear and recurrent implementations of data compression of Boolean patterns are obtained. As an example, the encoder problem is simulated and trained in an unsupervised way in a one-layer network. Compression of non-uniformly distributed binary data is also included. Unsupervised classification, even for continuous inputs, is demonstrated for the cases of four overlapping Gaussian spots and for a real-world example of thyroid diagnosis. In comparison with other techniques, the present model requires an exponentially smaller number of weights for the classification problem.
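The structure of such a learning rule can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's derivation: it combines a Hebbian term (correlation of pre- and postsynaptic states in the clamped phase, in the Boltzmann Machine style) with an anti-Hebbian "forgetting" term (correlation in the free-running phase), and scales the whole update by a per-synapse information weight. The function name `weight_update`, the argument names, and the exact placement of the information weight are all illustrative choices.

```python
def weight_update(s_i_clamped, s_j_clamped,
                  s_i_free, s_j_free,
                  info_weight, lr=0.1):
    """Toy synaptic update: Hebbian minus anti-Hebbian correlation,
    scaled by an information weight (an illustrative stand-in for the
    information transmitted through the synapse) and a learning rate.

    s_* are binary neuron states (0 or 1) sampled in the clamped and
    free-running phases; info_weight is assumed to lie in [0, 1].
    """
    hebbian = s_i_clamped * s_j_clamped      # strengthen co-active pairs (clamped phase)
    anti_hebbian = s_i_free * s_j_free       # "forgetting" term (free phase)
    return lr * info_weight * (hebbian - anti_hebbian)
```

In this reading, a synapse that transmits little information (`info_weight` near 0) barely changes, while an informative synapse is updated with the full contrast between the two phases, which is one way to picture the information-theoretic proportionality constant the abstract describes.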