INCREMENTAL LEARNING OF COMPLEX TEMPORAL PATTERNS

Authors
Wang, D.L.; Yuwono, B.
Citation
D.L. Wang and B. Yuwono, Incremental Learning of Complex Temporal Patterns, IEEE Transactions on Neural Networks, 7(6), 1996, pp. 1465-1481
Citations number
44
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
7
Issue
6
Year of publication
1996
Pages
1465 - 1481
Database
ISI
SICI code
1045-9227(1996)7:6<1465:ILOCTP>2.0.ZU;2-0
Abstract
A neural model for temporal pattern generation is used and analyzed for training with multiple complex sequences in a sequential manner. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining due to interference appears to be independent of the size of existing memory. The model is extended to include a chunking network which detects repeated subsequences between and within sequences. The chunking mechanism substantially reduces the amount of retraining in sequential training. Thus, the network investigated here constitutes an effective sequential memory. Various aspects of such a memory are discussed.
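Note: the abstract describes a sequential training regime in which sequences are acquired one at a time, earlier sequences may be degraded by interference, and degraded sequences are retrained. The sketch below is only a generic illustration of that regime, not the Wang-Yuwono network, and it omits the chunking mechanism; the toy delta-rule predictor and all names and parameters (ALPHABET, CONTEXT, LR, train, recalls) are assumptions of this sketch.

# Hypothetical illustration of sequential (incremental) sequence learning
# with interference-driven retraining.  NOT the model of the paper.
import numpy as np

ALPHABET = 8          # number of distinct symbols (assumed)
CONTEXT = 2           # previous symbols used to predict the next one (assumed)
LR = 0.2              # delta-rule learning rate (assumed)

rng = np.random.default_rng(0)
W = np.zeros((ALPHABET, ALPHABET * CONTEXT))   # weights: context -> next symbol


def encode(context):
    """One-hot encode the last CONTEXT symbols into a single input vector."""
    x = np.zeros(ALPHABET * CONTEXT)
    for i, s in enumerate(context):
        x[i * ALPHABET + s] = 1.0
    return x


def recalls(seq):
    """True if the memory reproduces every transition of seq (sequence intact)."""
    for t in range(CONTEXT, len(seq)):
        x = encode(seq[t - CONTEXT:t])
        if int(np.argmax(W @ x)) != seq[t]:
            return False
    return True


def train(seq, max_epochs=200):
    """Delta-rule training on one sequence until it is recalled intact."""
    global W
    for epoch in range(max_epochs):
        for t in range(CONTEXT, len(seq)):
            x = encode(seq[t - CONTEXT:t])
            target = np.zeros(ALPHABET)
            target[seq[t]] = 1.0
            W += LR * np.outer(target - W @ x, x)   # delta-rule update
        if recalls(seq):
            return epoch + 1
    return max_epochs


# Sequential training: acquire sequences one by one, then retrain any
# previously stored sequence that interference has left non-intact.
memory = []
for _ in range(6):
    new_seq = list(rng.integers(0, ALPHABET, size=10))
    train(new_seq)
    memory.append(new_seq)
    damaged = [s for s in memory[:-1] if not recalls(s)]
    for s in damaged:                                # retraining pass
        train(s)
    print(f"stored {len(memory)} sequences, retrained {len(damaged)} earlier ones")

In the paper, a chunking network that detects repeated subsequences between and within sequences further reduces this retraining; the sketch above does not implement that.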