SMOOTH BACKFITTING FOR ERRORS-IN-VARIABLES ADDITIVE MODELS

Citation
Kyunghee Han and Byeong U. Park, Smooth backfitting for errors-in-variables additive models, Annals of Statistics, 46(5), 2018, pp. 2216-2250
Journal title
Annals of Statistics
ISSN journal
0090-5364
Volume
46
Issue
5
Year of publication
2018
Pages
2216 - 2250
Database
ACNP
Abstract
In this work, we develop a new smooth backfitting method and theory for estimating additive nonparametric regression models when the covariates are contaminated by measurement errors. For this, we devise a new kernel function that suitably deconvolutes the bias due to measurement errors and gives the resulting estimator a projection interpretation in the space of additive functions. The deconvolution property and the projection interpretation are essential for a successful solution of the problem. We prove that the method based on the new kernel weighting scheme achieves the optimal rate of convergence in one-dimensional deconvolution problems when the smoothness of the measurement error distribution is below a threshold value. We find that the speed of convergence is slower than the univariate rate when the smoothness of the measurement error distribution is above the threshold, but it is still much faster than the optimal rate in multivariate deconvolution problems. The theory requires a deliberate analysis of the nonnegligible effects of measurement errors being propagated to other additive components through the backfitting operation. We present the finite sample performance of the deconvolution smooth backfitting estimators, which confirms our theoretical findings.
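To illustrate the kind of kernel weighting the abstract refers to, the sketch below implements a classical univariate deconvoluting kernel (not the authors' multivariate backfitting construction). It assumes Laplace(0, sigma) measurement error and a standard Gaussian base kernel K, for which the Fourier-inversion formula reduces to the closed form L(u) = K(u) - (sigma/h)^2 * K''(u). The function names `deconv_kernel` and `deconv_nw` are illustrative choices, not from the paper.

```python
import numpy as np

def deconv_kernel(u, h, sigma):
    """Deconvoluting kernel for Laplace(0, sigma) measurement error.

    Built from a standard Gaussian kernel K via Fourier inversion of
    phi_K(t) / phi_eps(t/h), which for Laplace error has the closed form
        L(u) = K(u) - (sigma/h)^2 * K''(u),
    with K''(u) = (u^2 - 1) K(u). L still integrates to 1 but can take
    negative values, which is typical of deconvoluting kernels.
    """
    K = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return K - (sigma / h) ** 2 * (u**2 - 1.0) * K

def deconv_nw(x0, W, Y, h, sigma):
    """Nadaraya-Watson regression estimate at x0 from contaminated data.

    W = X + Laplace(0, sigma) noise; the deconvoluting weights correct
    the bias that ordinary kernel weights would incur.
    """
    w = deconv_kernel((x0 - W) / h, h, sigma)
    return np.sum(w * Y) / np.sum(w)

# Toy usage: recover m(x) = sin(x) from noisy covariates.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 2000)
W = X + rng.laplace(0.0, 0.1, 2000)          # contaminated covariates
Y = np.sin(X) + rng.normal(0.0, 0.1, 2000)   # noisy responses
estimate = deconv_nw(0.5, W, Y, h=0.3, sigma=0.1)
```

The paper's contribution is the multivariate extension: embedding such deconvoluting weights into the smooth backfitting projection so that each additive component can be estimated at (or near) the one-dimensional deconvolution rate, rather than the much slower multivariate one.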