We perform a quantitative analysis of information processing in a simple neural network model with recurrent inhibition. We postulate that both excitatory and inhibitory synapses continually adapt according to the following Hebbian-type rules: for excitatory synapses, correlated pre- and post-synaptic activity induces enhanced excitation; for inhibitory synapses, it induces enhanced inhibition. Following synaptic equilibration in unsupervised learning processes, the model is found to perform a novel type of principal-component analysis which involves filtering and decorrelation. In the light of these results we discuss the possible role of the granule-/Golgi-cell subnetwork in the cerebellum.
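To illustrate how Hebbian synaptic adaptation can yield principal-component-like behaviour of the kind mentioned above, the following sketch uses Oja's rule, a classic normalized Hebbian update whose weight vector converges toward the principal eigenvector of the input covariance. This is a generic textbook illustration, not the model analysed in the paper; all parameter values here are arbitrary assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's model): Oja's Hebbian rule,
# which drives the weight vector toward the principal eigenvector of
# the input covariance -- one standard way in which Hebbian adaptation
# produces PCA-like behaviour. Parameters below are hypothetical.
rng = np.random.default_rng(0)

# Correlated 2-D inputs: variance is largest along the (1, 1) direction.
C = np.array([[3.0, 2.0], [2.0, 3.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C, size=5000)

w = rng.normal(size=2)  # initial synaptic weights
eta = 0.005             # learning rate
for x in X:
    y = w @ x                   # post-synaptic activity
    w += eta * y * (x - y * w)  # Hebbian term y*x with implicit decay

w /= np.linalg.norm(w)
# The top eigenvector of C is proportional to (1, 1), i.e. |w| should
# approach (1/sqrt(2), 1/sqrt(2)) up to sign.
print(np.abs(w))
```

The decay term `-eta * y**2 * w` built into Oja's rule keeps the weights bounded; without some such normalization, a pure Hebbian rule would grow without limit rather than equilibrate.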