RATES OF CONVERGENCE FOR MINIMUM CONTRAST ESTIMATORS

Authors
L. Birge, P. Massart
Citation
L. Birge and P. Massart, Rates of Convergence for Minimum Contrast Estimators, Probability Theory and Related Fields, 97(1-2), 1993, pp. 113-150
Citations number
37
Subject Categories
Statistics & Probability
ISSN journal
0178-8051
Volume
97
Issue
1-2
Year of publication
1993
Pages
113 - 150
Database
ISI
SICI code
0178-8051(1993)97:1-2<113:ROCFMC>2.0.ZU;2-N
Abstract
We shall present here a general study of minimum contrast estimators in a nonparametric setting (although our results are also valid in the classical parametric case) for independent observations. These estimators include many of the most popular estimators in various situations, such as maximum likelihood estimators, least squares and other estimators of the regression function, estimators for mixture models or deconvolution... The main theorem relates the rate of convergence of those estimators to the entropy structure of the space of parameters. Optimal rates depending on entropy conditions are already known, at least for some of the models involved, and they agree with what we get for minimum contrast estimators as long as the entropy counts are not too large. But, under some circumstances ("large" entropies or changes in the entropy structure due to local perturbations), the resulting rates are only suboptimal. Counterexamples are constructed which show that the phenomenon is real for nonparametric maximum likelihood or regression. This proves that, under purely metric assumptions, our theorem is optimal and that minimum contrast estimators happen to be suboptimal.
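For orientation, here is a minimal sketch of the standard minimum contrast framework the abstract refers to; the notation is illustrative and the paper's precise assumptions and theorem statement differ.

\[
\gamma_n(t) \;=\; \frac{1}{n}\sum_{i=1}^{n} \gamma(X_i, t),
\qquad
\hat{s}_n \;\in\; \operatorname*{argmin}_{t \in S}\, \gamma_n(t),
\]

where \(\gamma\) is the contrast function, e.g. \(\gamma(x,t) = -\log t(x)\) for maximum likelihood, or \(\gamma(x,t) = (y - t(z))^2\) with \(x = (z,y)\) for least squares regression. In this literature, a typical entropy-to-rate relation determines the rate \(\varepsilon_n\) as a solution of

\[
H(\varepsilon_n) \;\asymp\; n\,\varepsilon_n^{2},
\]

where \(H(\varepsilon)\) is a metric entropy of the parameter set \(S\) (the logarithm of a covering or bracketing number at scale \(\varepsilon\)). The abstract's point is that rates obtained this way match the known optimal rates only when the entropies are not too large; otherwise the minimum contrast estimator itself can be suboptimal.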