When prior knowledge about the unknown parameter is available, the Bayesian predictive density coincides with the Bayes estimator of the true density under Kullback-Leibler loss, but this is no longer true for other loss functions. In this paper we present a generalized Bayes rule that yields Bayes density estimators with respect to any alpha-divergence, a family that includes the Kullback-Leibler divergence and the Hellinger distance. For curved exponential models, we study the asymptotic behaviour of these predictive densities. We show that, whatever prior is used, the generalized Bayes rule improves (in a non-Bayesian sense) on the estimative density corresponding to a bias modification of the maximum likelihood estimator. This gives rise to a correspondence between choosing a prior density for the generalized Bayes rule and fixing a bias for the maximum likelihood estimator in the classical setting. A criterion for comparing and selecting prior densities is also given.
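To make the opening claim concrete (the notation below is ours, not the paper's): under Kullback-Leibler loss the Bayes estimator of the density is the usual posterior predictive density, writing $\pi(\theta \mid x)$ for the posterior,
\[
\hat p(y \mid x) \;=\; \int p(y \mid \theta)\, \pi(\theta \mid x)\, d\theta .
\]
For the alpha-divergence, one common parametrization (Amari's convention; the paper's normalization may differ) is
\[
D_\alpha(p \,\|\, q) \;=\; \frac{4}{1-\alpha^{2}}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx\right),
\]
which recovers the Kullback-Leibler divergence in the limits $\alpha \to \pm 1$ and is, up to a constant factor, the squared Hellinger distance at $\alpha = 0$.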