Nonparametric regression using locally weighted least squares was first discussed by Stone and by Cleveland. Recently, it was shown by Fan and by Fan and Gijbels that the local linear kernel-weighted least squares regression estimator has asymptotic properties making it superior, in certain senses, to the Nadaraya-Watson and Gasser-Müller kernel estimators. In this paper we extend their results on asymptotic bias and variance to the case of multivariate predictor variables. We are able to derive the leading bias and variance terms for general multivariate kernel weights using weighted least squares matrix theory. This approach is especially convenient when analyzing the asymptotic conditional bias and variance of the estimator at points near the boundary of the support of the predictors. We also investigate the asymptotic properties of the multivariate local quadratic least squares regression estimator discussed by Cleveland and Devlin and, in the univariate case, higher-order polynomial fits and derivative estimation.
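
For concreteness, the local linear estimator at a point $x$, the object whose asymptotic bias and variance are studied here, admits a closed weighted least squares form; the notation below is a standard illustrative sketch and is not fixed by the abstract:

\[
\hat{m}(x; H) \;=\; e_1^{\top}\bigl(X_x^{\top} W_x X_x\bigr)^{-1} X_x^{\top} W_x\, Y,
\]

where $X_x$ is the design matrix with $i$th row $\bigl(1,\,(X_i - x)^{\top}\bigr)$, $W_x = \operatorname{diag}\{K_H(X_i - x)\}$ collects the multivariate kernel weights for a kernel $K_H$ with bandwidth matrix $H$, and $e_1 = (1, 0, \ldots, 0)^{\top}$ extracts the intercept of the locally weighted fit. The weighted least squares matrix form is what makes the leading bias and variance terms, including near the boundary, amenable to direct analysis.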