Mixture of experts architectures for neural networks as a special case of conditional expectation formula.

Authors
J. Grim
Citation
J. Grim, Mixture of experts architectures for neural networks as a special case of conditional expectation formula, KYBERNETIKA, 34(4), 1998, pp. 417-422
Citations number
14
Subject Categories
AI Robotics and Automatic Control
Journal title
KYBERNETIKA
ISSN journal
0023-5954
Volume
34
Issue
4
Year of publication
1998
Pages
417 - 422
Database
ISI
SICI code
0023-5954(1998)34:4<417:MOEAFN>2.0.ZU;2-U
Abstract
Recently, a new and interesting neural network architecture called the "mixture of experts" has been proposed as a tool for real multivariate approximation or prediction. We show that the underlying problem is closely related to approximating the joint probability density of the involved variables by a finite mixture. In particular, assuming normal mixtures, we can explicitly write the conditional expectation formula, which can be interpreted as a mixture-of-experts network. In this way the related optimization problem reduces to the standard estimation of normal mixtures by means of the EM algorithm. The resulting prediction is optimal in the sense of minimum dispersion, provided the assumed mixture model is true. It is shown that some recently published results can be obtained by specifying the normal components of the mixtures in a special form.
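
The conditional expectation formula the abstract refers to can be written out for a normal mixture on the joint space of (x, y); the notation below is an assumption, since the abstract fixes none. With component weights w_m, means partitioned as (mu_m^x, mu_m^y), and covariance blocks C_m^{xx}, C_m^{xy}, C_m^{yx}, C_m^{yy}, the standard result is

```latex
% Conditional expectation under a normal mixture on the joint (x, y) space.
% Notation is assumed here, not taken from the paper itself.
\[
  E[y \mid x] \;=\; \sum_{m=1}^{M} g_m(x)
    \Bigl( \mu_m^{y} + C_m^{yx} \bigl(C_m^{xx}\bigr)^{-1} (x - \mu_m^{x}) \Bigr),
\]
\[
  g_m(x) \;=\;
    \frac{w_m \, \mathcal{N}\!\bigl(x;\, \mu_m^{x},\, C_m^{xx}\bigr)}
         {\sum_{j=1}^{M} w_j \, \mathcal{N}\!\bigl(x;\, \mu_j^{x},\, C_j^{xx}\bigr)}.
\]
```

The gating weights g_m(x) are posterior component probabilities given x alone, and each "expert" is the component-wise linear regression of y on x, which is the mixture-of-experts reading described in the abstract. Below is a minimal numerical sketch of the same reduction, assuming scikit-learn's EM implementation of normal-mixture estimation (the paper does not use this library; the helpers fit_joint_mixture and conditional_expectation are hypothetical names):

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def fit_joint_mixture(X, Y, n_components=3, seed=0):
    """EM estimation of a full-covariance normal mixture on the joint (x, y) space."""
    Z = np.column_stack([X, Y])
    return GaussianMixture(n_components=n_components,
                           covariance_type="full",
                           random_state=seed).fit(Z)

def conditional_expectation(gm, X, dx):
    """E[y | x] under the fitted mixture; dx is the dimension of x."""
    # Gating weights: posterior component probabilities given x only,
    # computed from the marginal normal densities (log-sum-exp for stability).
    log_g = np.stack([
        np.log(gm.weights_[m]) +
        multivariate_normal.logpdf(X, gm.means_[m, :dx],
                                   gm.covariances_[m][:dx, :dx])
        for m in range(gm.n_components)
    ], axis=1)
    g = np.exp(log_g - log_g.max(axis=1, keepdims=True))
    g /= g.sum(axis=1, keepdims=True)
    preds = np.zeros((X.shape[0], gm.means_.shape[1] - dx))
    for m in range(gm.n_components):
        mu_x, mu_y = gm.means_[m, :dx], gm.means_[m, dx:]
        C = gm.covariances_[m]
        # "Expert" m: the component-wise linear regression of y on x,
        # mu_y + C^{yx} (C^{xx})^{-1} (x - mu_x).
        expert = mu_y + (X - mu_x) @ np.linalg.solve(C[:dx, :dx], C[:dx, dx:])
        preds += g[:, [m]] * expert
    return preds

# Toy usage: predict y = sin(x) + noise from a 5-component joint mixture.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
Y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
gm = fit_joint_mixture(X, Y, n_components=5)
y_hat = conditional_expectation(gm, X, dx=1)
```

Note how the whole training problem is carried by the single EM fit of the joint mixture, matching the abstract's claim that the mixture-of-experts optimization reduces to standard normal-mixture estimation.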