DYNAMIC ONLINE CLUSTERING AND STATE EXTRACTION - AN APPROACH TO SYMBOLIC LEARNING

Authors
S. Das and M. Mozer
Citation
S. Das and M. Mozer, DYNAMIC ONLINE CLUSTERING AND STATE EXTRACTION - AN APPROACH TO SYMBOLIC LEARNING, Neural Networks, 11(1), 1998, pp. 53-64
Citations number
19
Subject Categories
Computer Science, Artificial Intelligence
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
11
Issue
1
Year of publication
1998
Pages
53 - 64
Database
ISI
SICI code
0893-6080(1998)11:1<53:DOCASE>2.0.ZU;2-1
Abstract
Although recurrent neural nets have been moderately successful in learning to emulate finite-state machines (FSMs), the continuous internal state dynamics of a neural net are not well matched to the discrete behavior of an FSM. We describe an architecture, called DOLCE, that allows discrete states to evolve in a net as learning progresses. DOLCE consists of a standard recurrent neural net trained by gradient descent and an adaptive clustering technique that quantizes the state space. We describe two implementations of DOLCE. The first implementation, called DOLCEu, uses an adaptive clustering scheme in an unsupervised mode to determine both the number of clusters and the partitioning of the state space as learning progresses. The second model, DOLCEs, uses a Gaussian Mixture Model in a supervised learning framework to infer the states of an FSM. DOLCEs is based on the assumption that a finite set of discrete internal states is required for the task, and that the actual network state belongs to this set but has been corrupted by noise due to inaccuracy in the weights. DOLCEs learns to recover the discrete state with maximum a posteriori probability from the noisy state. Simulations show that both implementations of DOLCE lead to a significant improvement in generalization performance over earlier neural net approaches to FSM induction. The idea of adaptive quantization is not just applicable to DOLCE but can be applied to other domains as well. (C) 1998 Elsevier Science Ltd. All rights reserved.
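To illustrate the core idea behind the supervised variant (DOLCEs) as described in the abstract, the following is a minimal, hypothetical sketch of MAP recovery of a discrete state from a noisy continuous hidden state under a Gaussian mixture model. It is not the authors' implementation; all function names, dimensions, and parameter values are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): recover the discrete FSM state
# with maximum a posteriori probability from a noisy recurrent-net hidden
# state, assuming isotropic Gaussian noise around each candidate state.
import numpy as np

def map_quantize(h, means, priors, sigma):
    """Return the mixture component (candidate discrete state) with the
    highest posterior probability given the noisy continuous state h.

    h      : (d,)   noisy hidden state produced by the recurrent net
    means  : (k, d) candidate discrete states (mixture means)
    priors : (k,)   mixing proportions
    sigma  : assumed isotropic noise standard deviation
    """
    # Log-likelihood of h under each isotropic Gaussian component.
    sq_dist = np.sum((means - h) ** 2, axis=1)
    log_lik = -sq_dist / (2.0 * sigma ** 2)
    # Unnormalized log posterior = log prior + log likelihood.
    log_post = np.log(priors) + log_lik
    k = int(np.argmax(log_post))
    return k, means[k]  # discrete state index and its quantized state vector

# Example: three candidate discrete states in a 2-D hidden space.
means = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
priors = np.array([0.4, 0.3, 0.3])
noisy_h = np.array([0.9, 0.1])
state, quantized = map_quantize(noisy_h, means, priors, sigma=0.2)
print(state, quantized)  # expected: component 1, i.e. [1., 0.]
```

In the paper's framework the mixture parameters would be adapted during training rather than fixed as above; this sketch only shows the quantization step, in which the noisy state is replaced by the MAP discrete state.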