The interest in neuronal networks stems in large part from the option not to construct them, but to train them. The mechanisms governing synaptic modification during such training are assumed to depend on signals locally available at the synapses. In contrast, the performance of a network is suitably measured on a global scale. Here we propose a learning rule that addresses this conflict. It is inspired by recent physiological experiments and exploits the interaction of inhibitory input and backpropagating action potentials in pyramidal neurons. This mechanism makes information on the global scale available as a local signal. As a result, several desirable features can be combined: the learning rule allows fast synaptic modifications, approaching one-shot learning; nevertheless, it leads to stable representations during ongoing learning. Furthermore, the response properties of the neurons are not globally correlated, but cover the whole stimulus space.
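The abstract does not spell out the rule itself, so the following is only a minimal sketch of the general idea, not the authors' actual rule: a local Hebbian update gated by a globally broadcast plasticity signal. A winner-take-all gate stands in for the inhibitory/backpropagating-action-potential interaction described above, a large learning rate approximates one-shot modification, and an Oja-style decay term keeps weights bounded for stability. All names and parameters (step, eta, the gate) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_neurons = 20, 5
W = rng.normal(scale=0.1, size=(n_neurons, n_inputs))

def step(W, x, eta=0.5):
    """One plasticity step: a local Hebbian update gated by a global signal.

    The winner-take-all gate is a stand-in for the network-wide signal the
    abstract attributes to inhibition interacting with backpropagating
    action potentials; each synapse then needs only its own pre- and
    postsynaptic activities plus this broadcast gate.
    """
    y = W @ x                              # postsynaptic activities
    gate = np.zeros(n_neurons)
    gate[np.argmax(y)] = 1.0               # global signal: only the most
                                           # active neuron may change
    # Large eta -> fast, near one-shot change at the gated neuron;
    # the Oja-style decay term keeps weights bounded, so representations
    # stay stable under ongoing learning.
    dW = eta * gate[:, None] * (np.outer(y, x) - (y ** 2)[:, None] * W)
    return W + dW

for _ in range(200):
    W = step(W, rng.normal(size=n_inputs))

Because plasticity is confined to the currently most responsive neuron, the units specialize on different inputs; this is one simple way response properties can come to cover the whole stimulus space rather than becoming globally correlated, though the paper's own mechanism may differ.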