This paper is concerned with finite mixture models in which the populations from one observation to the next are selected according to an unobserved Markov process. A new, full Bayesian approach based on the method of Gibbs sampling is developed. Calculations are simplified by data augmentation, achieved by introducing a population index variable into the list of unknown parameters. It is shown that the latent variables, one for each observation, can be simulated from their joint distribution given the data and the remaining parameters. This result serves to accelerate the convergence of the Gibbs sampler. Modal estimates are also computed by stochastic versions of the EM algorithm. These provide an alternative to a full Bayesian approach and to existing methods of locating the maximum likelihood estimate. The ideas are applied in detail to Poisson data, mixtures of multivariate normal distributions, and autoregressive time series.
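The key step described above, simulating all latent population indices jointly given the data and the remaining parameters, can be sketched as a forward filtering / backward sampling pass. The sketch below is illustrative only: the two-state Poisson setting, the rates, the transition matrix, and the function name `sample_latent_states` are assumptions for the example, not the paper's specification.

```python
import numpy as np

def sample_latent_states(y, rates, trans, init, rng):
    """Draw the latent state sequence from its joint distribution
    given data y, Poisson rates, transition matrix, and initial dist.
    (Forward filtering / backward sampling; illustrative sketch.)"""
    n, k = len(y), len(rates)
    # Poisson likelihoods p(y_t | z_t = j), up to a factor constant in j.
    lik = np.exp(-rates)[None, :] * rates[None, :] ** y[:, None]
    # Forward filtering: alpha[t, j] proportional to p(z_t = j | y_1..t).
    alpha = np.empty((n, k))
    alpha[0] = init * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, n):
        alpha[t] = (alpha[t - 1] @ trans) * lik[t]
        alpha[t] /= alpha[t].sum()
    # Backward sampling: draw z_n, then z_t given z_{t+1} for t = n-1,...,1.
    z = np.empty(n, dtype=int)
    z[-1] = rng.choice(k, p=alpha[-1])
    for t in range(n - 2, -1, -1):
        w = alpha[t] * trans[:, z[t + 1]]
        z[t] = rng.choice(k, p=w / w.sum())
    return z

rng = np.random.default_rng(0)
rates = np.array([1.0, 10.0])      # assumed Poisson rates of the two populations
trans = np.array([[0.95, 0.05],
                  [0.10, 0.90]])   # assumed Markov transition matrix
y = np.array([0, 1, 2, 9, 12, 8, 11, 1, 0, 2])
z = sample_latent_states(y, rates, trans, np.array([0.5, 0.5]), rng)
```

Drawing the whole sequence in one block, rather than one state at a time conditional on its neighbours, is what yields the convergence acceleration the abstract refers to.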