LEARNING CONTINUOUS PROBABILITY DISTRIBUTIONS WITH SYMMETRIC DIFFUSION NETWORKS

Citation
J.R. Movellan and J.L. McClelland, LEARNING CONTINUOUS PROBABILITY DISTRIBUTIONS WITH SYMMETRIC DIFFUSION NETWORKS, Cognitive Science, 17(4), 1993, pp. 463-496
Number of citations
35
Subject Categories
Psychology, Experimental
Journal title
Cognitive Science
Journal ISSN
03640213
Volume
17
Issue
4
Year of publication
1993
Pages
463 - 496
Database
ISI
SICI code
0364-0213(1993)17:4<463:LCPWSD>2.0.ZU;2-X
Abstract
In this article we present symmetric diffusion networks, a family of networks that instantiate the principles of continuous, stochastic, adaptive, and interactive propagation of information. Using methods of Markovian diffusion theory, we formalize the activation dynamics of these networks and then show that they can be trained to reproduce entire multivariate probability distributions on their outputs using the contrastive Hebbian learning rule (CHL). We show that CHL performs gradient descent on an error function that captures differences between desired and obtained continuous multivariate probability distributions. This allows the learning algorithm to go beyond expected values of output units and to approximate complete probability distributions on continuous multivariate activation spaces. We argue that learning continuous distributions is an important task underlying a variety of real-life situations that were beyond the scope of previous connectionist networks. Deterministic networks, such as backpropagation networks, cannot learn this task because they are limited to learning average values of independent output units. Previous stochastic connectionist networks could learn probability distributions, but they were limited to discrete variables. Simulations show that symmetric diffusion networks can be trained with the CHL rule to approximate discrete and continuous probability distributions of various types.
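Illustrative code sketch
To make the training procedure described in the abstract concrete, the following NumPy sketch simulates a small symmetric stochastic network with Euler-Maruyama steps and applies the standard two-phase contrastive Hebbian update, delta w_ij proportional to the average of x_i * x_j in the clamped phase minus that average in the free phase. This is an assumption-laden illustration, not the authors' implementation: the logistic drift term, the step size, the noise level sigma, and the names sample_equilibrium and chl_update are all hypothetical stand-ins for the diffusion dynamics formalized in the article.

import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_equilibrium(W, b, n_units, clamp=None, steps=400, dt=0.01,
                       sigma=0.5, rng=None):
    # Euler-Maruyama simulation of assumed Langevin-style dynamics:
    #   dx_i = (logistic(net_i) - x_i) dt + sigma dB_i
    # `clamp` maps unit index -> value held fixed (the clamped phase).
    rng = rng if rng is not None else np.random.default_rng()
    x = np.full(n_units, 0.5)
    if clamp:
        for i, v in clamp.items():
            x[i] = v
    for _ in range(steps):
        drift = logistic(W @ x + b) - x
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n_units)
        x = x + drift * dt + noise
        if clamp:
            for i, v in clamp.items():
                x[i] = v
    return x

def chl_update(W, b, clamp, lr=0.05, n_samples=20, rng=None):
    # Contrastive Hebbian learning: delta w_ij is proportional to
    # <x_i x_j> clamped minus <x_i x_j> free, estimated by sampling.
    rng = rng if rng is not None else np.random.default_rng()
    n = W.shape[0]
    co_plus = np.zeros((n, n)); co_minus = np.zeros((n, n))
    m_plus = np.zeros(n); m_minus = np.zeros(n)
    for _ in range(n_samples):
        xp = sample_equilibrium(W, b, n, clamp=clamp, rng=rng)
        xm = sample_equilibrium(W, b, n, clamp=None, rng=rng)
        co_plus += np.outer(xp, xp); co_minus += np.outer(xm, xm)
        m_plus += xp; m_minus += xm
    dW = lr * (co_plus - co_minus) / n_samples
    dW = 0.5 * (dW + dW.T)        # keep weights symmetric
    np.fill_diagonal(dW, 0.0)     # no self-connections
    b = b + lr * (m_plus - m_minus) / n_samples
    return W + dW, b

# Hypothetical usage: nudge a 3-unit network so two visible units
# track samples drawn from a stand-in target distribution.
rng = np.random.default_rng(0)
W, b = np.zeros((3, 3)), np.zeros(3)
for _ in range(100):
    target = rng.normal(0.5, 0.1, size=2)
    W, b = chl_update(W, b, clamp={0: target[0], 1: target[1]}, rng=rng)

Symmetrizing dW and zeroing its diagonal reflect the symmetric, self-connection-free weight matrix that this family of networks assumes; symmetry is what licenses the gradient-descent interpretation of CHL mentioned in the abstract.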