B. Ans and S. Rousset, Avoiding catastrophic forgetting by coupling two reverberating neural networks, Comptes rendus de l'Académie des sciences, Série III, Sciences de la vie, 320(12), 1997, pp. 989-997.
Gradient descent learning procedures are most often used in neural network modeling. When these algorithms (e.g., backpropagation) are applied to sequential learning tasks, a major drawback termed catastrophic forgetting (or catastrophic interference) generally arises: when a network that has already learned a first set of items is next trained on a second set of items, the newly learned information may completely destroy the information previously learned. To avoid this implausible failure, we propose a two-network architecture in which new items are learned by a first network concurrently with internal pseudo-items originating from a second network. Since the pseudo-items are shown to reflect the structure of the items previously learned by the first network, the model implements a refreshing mechanism that reuses the old information. The crucial point is that this refreshing mechanism is based on reverberating neural networks, which need only random stimulation to operate. The model thus provides a means to dramatically reduce retroactive interference while preserving the essentially distributed nature of information, and it proposes an original but plausible means to 'copy and paste' a distributed memory from one place in the brain to another.
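To make the two-network mechanism concrete, here is a minimal sketch of the pseudo-rehearsal idea the abstract describes, under several simplifying assumptions: small one-hidden-layer tanh networks trained by plain backpropagation, random ±1 vectors as stimulation, and a single forward pass through the second ("storage") network to produce each pseudo-item rather than the paper's iterated reverberating process. The names (init_net, pseudo_item, the layer sizes) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hid, n_out):
    """Small one-hidden-layer tanh network (weight matrices only)."""
    return [rng.normal(0.0, 0.5, (n_in, n_hid)),
            rng.normal(0.0, 0.5, (n_hid, n_out))]

def forward(w, x):
    h = np.tanh(x @ w[0])
    return h, np.tanh(h @ w[1])

def train_step(w, x, t, lr=0.1):
    """One backpropagation step on squared error."""
    h, y = forward(w, x)
    dz2 = (y - t) * (1.0 - y**2)           # output-layer delta
    dz1 = (dz2 @ w[1].T) * (1.0 - h**2)    # hidden-layer delta
    w[1] -= lr * np.outer(h, dz2)
    w[0] -= lr * np.outer(x, dz1)

def pseudo_item(w_storage, n_in):
    """Random stimulation of the storage network yields a pseudo-item
    whose target reflects the structure of previously learned items."""
    x = rng.choice([-1.0, 1.0], size=n_in)
    _, t = forward(w_storage, x)
    return x, t

# --- sequential learning with interleaved pseudo-items ---
n_in, n_hid, n_out = 8, 16, 8
set1 = [(rng.choice([-1.0, 1.0], n_in), rng.choice([-1.0, 1.0], n_out))
        for _ in range(5)]
set2 = [(rng.choice([-1.0, 1.0], n_in), rng.choice([-1.0, 1.0], n_out))
        for _ in range(5)]

net = init_net(n_in, n_hid, n_out)
for _ in range(2000):                       # first network learns set 1
    for x, t in set1:
        train_step(net, x, t)

storage = [w.copy() for w in net]           # second network holds old knowledge

for _ in range(2000):                       # learn set 2 with pseudo-rehearsal
    for x, t in set2:
        train_step(net, x, t)               # new item
        xp, tp = pseudo_item(storage, n_in) # interleaved pseudo-item
        train_step(net, xp, tp)

err = np.mean([(forward(net, x)[1] - t)**2 for x, t in set1])
print(f"mean squared error on old items after new learning: {err:.3f}")
```

Omitting the pseudo_item step in the second training loop reproduces the catastrophic-forgetting baseline, so the same script can be used to compare retention of set 1 with and without the refreshing mechanism.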