G. Lattanzi et al., "Stochastic learning in a neural network with adapting synapses," Physical Review E 56(4), 1997, pp. 4567-4573
We consider a neural network with adapting synapses whose dynamics can be computed analytically. The model consists of N neurons, each connected to K input neurons chosen at random from the network. The synapses are n-state variables that evolve in time according to stochastic learning rules; a parallel stochastic dynamics is assumed for the neurons. Since the network maintains the same dynamics whether it is engaged in computation or in learning new memories, a very low probability of synaptic transitions is assumed. In the limit N → ∞ with K large and finite, the correlations of neurons and synapses can be neglected, and the dynamics can be calculated analytically through flow equations for the macroscopic parameters of the system.
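As an illustration of the class of dynamics described in the abstract, the following is a minimal simulation sketch in Python/NumPy. The concrete choices here (binary ±1 neurons, equally spaced synaptic levels in [-1, 1], a Glauber-like parallel update at inverse temperature beta, and a Hebbian-like one-step synaptic transition with small probability p_learn) are assumptions made for illustration; the paper's actual transition probabilities and learning rule are not specified in the abstract.

```python
import numpy as np

# Illustrative parameters (values are assumptions, not taken from the paper)
N = 1000         # number of neurons
K = 20           # random inputs per neuron, K << N
n_states = 3     # number of discrete synaptic states
beta = 2.0       # inverse temperature of the assumed Glauber-like neuron dynamics
p_learn = 1e-3   # very low probability of a synaptic transition per step

rng = np.random.default_rng(0)

# Each neuron receives input from K neurons chosen at random in the network
inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])

# n-state synapses, represented here as equally spaced levels in [-1, 1] (assumption)
levels = np.linspace(-1.0, 1.0, n_states)
J = rng.integers(0, n_states, size=(N, K))   # synaptic state index for each connection
S = rng.choice([-1, 1], size=N)              # binary +/-1 neuron states (assumption)

def step(S, J):
    """One parallel stochastic neuron update followed by rare synaptic transitions."""
    # Local field on each neuron from its K presynaptic inputs
    h = np.sum(levels[J] * S[inputs], axis=1)
    # Parallel stochastic update: P(S_i = +1) is a sigmoid of the local field (assumed form)
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    S_new = np.where(rng.random(N) < p_up, 1, -1)
    # Stochastic learning: with small probability p_learn a synapse moves one state
    # in the direction of the post/pre correlation (Hebbian-like, assumed rule);
    # the same rule is applied at every step, so computation and learning share one dynamics
    corr = S_new[:, None] * S[inputs]
    attempt = rng.random((N, K)) < p_learn
    J_new = np.clip(J + attempt * np.sign(corr).astype(int), 0, n_states - 1)
    return S_new, J_new

# Run a short trajectory and track a macroscopic observable (mean activity)
for t in range(100):
    S, J = step(S, J)
print("mean activity:", S.mean())
```

Because K is large but much smaller than N, each neuron's inputs are nearly independent, which is the regime in which the paper's flow equations for macroscopic order parameters become exact.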