Mixtures of probabilistic principal component analyzers

Citation
M.E. Tipping and C.M. Bishop, Mixtures of probabilistic principal component analyzers, NEURAL COMP, 11(2), 1999, pp. 443-482
Citations number
34
Categorie Soggetti
Neurosciences & Behavior; AI, Robotics and Automatic Control
Journal title
NEURAL COMPUTATION
ISSN journal
0899-7667
Volume
11
Issue
2
Year of publication
1999
Pages
443 - 482
Database
ISI
SICI code
0899-7667(19990215)11:2<443:MOPPCA>2.0.ZU;2-D
Abstract
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Therefore, previous attempts to formulate mixture models for PCA have been ad hoc to some extent. In this article, PCA is formulated within a maximum likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm. We discuss the advantages of this model in the context of clustering, density modeling, and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
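For a single component, the maximum-likelihood framework the abstract refers to admits a closed-form solution: the loadings are the leading sample-covariance eigenvectors scaled by the excess of their eigenvalues over the noise variance, and the noise variance is the mean of the discarded eigenvalues. The sketch below (a minimal illustration, not the paper's mixture/EM implementation; the function name `ppca_ml` and its interface are assumptions for this example) shows that single-analyzer fit in NumPy:

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form ML fit of probabilistic PCA (single analyzer).

    X : (N, d) data matrix; q : latent dimensionality, q < d.
    Returns loadings W (d, q), noise variance sigma2, and mean mu.
    Note: W is recovered up to an arbitrary q x q rotation.
    """
    N, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # sample covariance
    evals, evecs = np.linalg.eigh(S)              # ascending eigenvalues
    order = np.argsort(evals)[::-1]               # sort descending
    evals, evecs = evals[order], evecs[:, order]
    # ML noise variance: average of the d - q discarded eigenvalues
    sigma2 = evals[q:].mean()
    # ML loadings: top-q eigenvectors scaled by sqrt(lambda_i - sigma2)
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2, mu
```

The fitted model covariance W W^T + sigma2 I then matches the sample covariance exactly in its top-q eigenspace (and in trace), which is what makes each local PCA a proper density that can be combined in a mixture and trained by EM.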