The contextual layered associative memory (CLAM) has been developed as a self-generating structure that implements a probabilistic encoding scheme. Its training algorithms are geared towards the unsupervised generation of a layerable associative mapping (Thacker & Mayhew, 1989). We show here that the resulting structure supports layers that can be trained to produce outputs approximating the conditional probabilities of classification. The unsupervised and supervised learning algorithms operate independently, permitting the unsupervised representational layer to be developed before supervision is available. The system thus supports learning that is inherently more flexible than conventional node-labelling schemes. (C) 1997 Elsevier Science Ltd. All rights reserved.
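The two-stage idea in the abstract can be illustrated with a minimal sketch. This is not the CLAM algorithm itself: here a simple competitive (k-means-style) layer stands in for the self-generated representation, and the supervised stage merely attaches label frequencies to each prototype, so its outputs approximate P(class | prototype). All names (`centres`, `classify`, the toy data) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs. Labels are withheld until the supervised stage.
X = np.vstack([rng.normal(0.0, 0.5, (100, 2)), rng.normal(3.0, 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# --- Unsupervised stage: build the representational layer without labels ---
# (a competitive-learning stand-in for CLAM's self-generated mapping)
K = 4
centres = X[rng.choice(len(X), K, replace=False)].copy()
for _ in range(20):
    # assign each input to its nearest prototype, then recentre prototypes
    nearest = np.argmin(((X[:, None, :] - centres) ** 2).sum(-1), axis=1)
    for k in range(K):
        if (nearest == k).any():
            centres[k] = X[nearest == k].mean(axis=0)

# --- Supervised stage: trained independently, once labels become available ---
# The representation is frozen; supervision only estimates P(class | prototype)
# from label counts, rather than relabelling nodes.
counts = np.zeros((K, 2))
for k_idx, label in zip(nearest, y):
    counts[k_idx, label] += 1
cond_prob = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)

def classify(x):
    """Return the estimated conditional class probabilities for input x."""
    k = np.argmin(((x - centres) ** 2).sum(-1))
    return cond_prob[k]

print(classify(np.array([0.0, 0.0])))  # concentrated on class 0 (first blob)
```

Because the two stages are decoupled, the unsupervised layer can be trained first and reused: only the count table needs updating when supervision arrives, which is the flexibility the abstract contrasts with fixed node-labelling schemes.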