We shall present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent observations. These estimators include many of the most popular estimators in various situations such as maximum likelihood estimators, least squares and other estimators of the regression function, estimators for mixture models or deconvolution... The main theorem relates the rate of convergence of those estimators to the entropy structure of the space of parameters. Optimal rates depending on entropy conditions are already known, at least for some of the models involved, and they agree with what we get for minimum contrast estimators as long as the entropy counts are not too large. But, under some circumstances ("large" entropies or changes in the entropy structure due to local perturbations), the resulting rates are only suboptimal. Counterexamples are constructed which show that the phenomenon is real for nonparametric maximum likelihood or regression. This proves that, under purely metric assumptions, our theorem is optimal and that minimum contrast estimators happen to be suboptimal.
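To fix ideas, the following display is an illustrative sketch (the notation is ours, not taken from the abstract): given independent observations $X_1,\dots,X_n$, a parameter set $S$, and a contrast function $\gamma$, a minimum contrast estimator minimizes the empirical contrast over $S$,
\[
  \hat{s}_n \;=\; \operatorname*{arg\,min}_{t \in S}\, \gamma_n(t),
  \qquad
  \gamma_n(t) \;=\; \frac{1}{n}\sum_{i=1}^{n} \gamma(t, X_i).
\]
For instance, taking $\gamma(t,x) = -\log t(x)$ recovers the maximum likelihood estimator, while $\gamma\bigl(t,(x,y)\bigr) = \bigl(y - t(x)\bigr)^2$ recovers the least squares estimator of a regression function, two of the special cases mentioned above.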