The generative topographic mapping (GTM) model was introduced by Bishop et al. (1998, Neural Comput. 10(1), 215-234) as a probabilistic re-formulation of the self-organizing map (SOM). It offers a number of advantages compared with the standard SOM, and has already been used in a variety of applications. In this paper we report on several extensions of the GTM, including an incremental version of the EM algorithm for estimating the model parameters, the use of local subspace models, extensions to mixed discrete and continuous data, semi-linear models which permit the use of high-dimensional manifolds whilst avoiding computational intractability, Bayesian inference applied to hyper-parameters, and an alternative framework for the GTM based on Gaussian processes. All of these developments directly exploit the probabilistic structure of the GTM, thereby allowing the underlying modelling assumptions to be made explicit. They also highlight the advantages of adopting a consistent probabilistic framework for the formulation of pattern recognition algorithms. (C) 1998 Elsevier Science B.V. All rights reserved.
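To make the probabilistic structure referred to above concrete, the following is a minimal sketch of the standard batch GTM trained with EM (not the paper's incremental variant or other extensions). All specifics here — the 1-D latent grid, RBF basis, toy data, and iteration count — are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical minimal GTM sketch: latent grid points z_k are mapped into
# data space by y(z; W) = W * phi(z), with isotropic Gaussian noise of
# precision beta. Batch EM; the paper's incremental EM is not shown.

rng = np.random.default_rng(0)

# Toy data: 2-D points scattered around a noisy parabola.
t = rng.uniform(-1, 1, size=(200, 1))
X = np.hstack([t, t**2]) + 0.05 * rng.standard_normal((200, 2))

K, M = 20, 5                        # latent grid size, number of RBF centres
Z = np.linspace(-1, 1, K)[:, None]  # 1-D latent grid (assumed for simplicity)
mu = np.linspace(-1, 1, M)[:, None]
sigma = 2.0 / (M - 1)

def basis(z):
    # Gaussian RBF design matrix (K x M) plus a bias column.
    d2 = (z - mu.T) ** 2
    return np.hstack([np.exp(-d2 / (2 * sigma**2)), np.ones((len(z), 1))])

Phi = basis(Z)                      # K x (M + 1)
W = 0.1 * rng.standard_normal((Phi.shape[1], X.shape[1]))
beta = 1.0

for _ in range(50):
    Y = Phi @ W                     # mixture centres in data space (K x D)
    # E-step: responsibilities of each latent point for each data point.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)   # N x K
    logR = -0.5 * beta * d2
    logR -= logR.max(axis=1, keepdims=True)               # numerical stability
    R = np.exp(logR)
    R /= R.sum(axis=1, keepdims=True)
    # M-step: weighted least squares for W, then closed-form beta update.
    G = np.diag(R.sum(axis=0))
    W = np.linalg.solve(Phi.T @ G @ Phi + 1e-6 * np.eye(Phi.shape[1]),
                        Phi.T @ (R.T @ X))
    beta = X.size / (R * d2).sum()
```

Because every quantity is a proper probability (responsibilities sum to one, beta is a noise precision), the extensions listed in the abstract — incremental EM, Bayesian treatment of hyper-parameters, and so on — slot in without changing this basic structure.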