A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining due to interference appears to be independent of the size of existing memory. The model is extended to include a chunking network which detects repeated subsequences between and within sequences. The chunking mechanism substantially reduces the amount of retraining in sequential training. Thus, the network investigated here constitutes an effective sequential memory. Various aspects of such a memory are discussed.
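As a rough illustration of what "detecting repeated subsequences between and within sequences" entails, the following minimal Python sketch counts fixed-length windows over a set of symbol sequences and reports those occurring more than once. This is a hypothetical plain n-gram count, not the paper's neural chunking network; the function name, window lengths, and example sequences are all assumptions introduced here for illustration.

```python
from collections import Counter

def repeated_subsequences(sequences, min_len=2, max_len=4):
    """Count every contiguous window of the given lengths, both within
    and across sequences, and return those that occur more than once."""
    counts = Counter()
    for seq in sequences:
        for n in range(min_len, max_len + 1):
            for i in range(len(seq) - n + 1):
                counts[tuple(seq[i:i + n])] += 1
    return {chunk: c for chunk, c in counts.items() if c > 1}

# Example: the subsequence "BC" recurs within the first sequence and
# across both, making it a candidate chunk for shared storage.
seqs = [list("ABCBC"), list("DBCE")]
for chunk, count in sorted(repeated_subsequences(seqs).items()):
    print("".join(chunk), count)
```

In the sketch, any window reported here would be a candidate for storage as a single chunk, so that retraining on a new sequence containing it need not relearn the shared subsequence from scratch.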