DISTRIBUTED ARTMAP - A NEURAL NETWORK FOR FAST DISTRIBUTED SUPERVISED LEARNING

Citation
G.A. Carpenter et al., Distributed ARTMAP - A neural network for fast distributed supervised learning, Neural Networks, 11(5), 1998, pp. 793-813
Citations number
33
Subject Categories
Computer Science, Artificial Intelligence
Journal title
Neural Networks
ISSN journal
08936080
Volume
11
Issue
5
Year of publication
1998
Pages
793 - 813
Database
ISI
SICI code
0893-6080(1998)11:5<793:DA-ANF>2.0.ZU;2-U
Abstract
Distributed coding at the hidden layer of a multi-layer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow off-line learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast on-line learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. An implementation algorithm here describes one class of dARTMAP networks. This system incorporates elements of the unsupervised dART model, as well as new features, including a content-addressable memory (CAM) rule for improved contrast control at the coding field. A dARTMAP system reduces to fuzzy ARTMAP when coding is winner-take-all. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression. (C) 1998 Elsevier Science Ltd. All rights reserved.
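The abstract's central contrast - distributed coding at the coding field versus the winner-take-all limit in which dARTMAP reduces to fuzzy ARTMAP - can be illustrated with a minimal sketch. The power-rule activation below is a hypothetical stand-in for contrast control, not the paper's actual CAM rule; the function names and the parameter `p` are assumptions for illustration only.

```python
import numpy as np

def cam_activation(T, p):
    """Illustrative contrast-controlled coding-field activation.

    T : bottom-up signals to the coding-field categories
    p : contrast parameter; larger p concentrates activation
        on the strongest categories (hypothetical power rule,
        not the CAM rule defined in the paper)
    """
    Tp = np.asarray(T, dtype=float) ** p
    return Tp / Tp.sum()  # distributed code, sums to 1

def winner_take_all(T):
    """Winner-take-all limit: all activation on the max category."""
    y = np.zeros(len(T))
    y[int(np.argmax(T))] = 1.0
    return y
```

As the contrast parameter grows, the distributed code approaches the winner-take-all code, mirroring the stated reduction of dARTMAP to fuzzy ARTMAP in that limit.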