When estimating a regression function or its derivatives, local polynomials are an attractive choice due to their flexibility and asymptotic performance. Seifert and Gasser proposed ridging of local polynomials to overcome problems with variance for random design while retaining their advantages. In this article we present a data-independent rule of thumb and a data-adaptive spatial choice of the ridge parameter in local linear regression. In a framework of penalized local least squares regression, the methods are generalized to higher-order polynomials, to estimation of derivatives, and to multivariate designs. The main message is that ridging is a powerful tool for improving the performance of local polynomials. A rule of thumb offers drastic improvements; data-adaptive ridging brings further, but modest, gains in mean squared error.
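
To fix ideas, one common way to write ridging in local linear regression as a penalized local least squares problem is sketched below; the notation ($K$, $h$, $r$) is ours and the centering of the penalty need not match the exact formulation of Seifert and Gasser. At a point $x$, one solves
\[
(\hat\beta_0, \hat\beta_1)
= \arg\min_{\beta_0,\beta_1}
\sum_{i=1}^{n} K\!\left(\frac{X_i - x}{h}\right)
\bigl(Y_i - \beta_0 - \beta_1 (X_i - x)\bigr)^2
\;+\; r\,\beta_1^2 ,
\]
and sets $\hat m(x) = \hat\beta_0$ (or $\hat m'(x) = \hat\beta_1$ when the first derivative is of interest). Here $K$ is a kernel, $h$ the bandwidth, and $r \ge 0$ the ridge parameter: $r = 0$ recovers the ordinary local linear estimator, while increasing $r$ shrinks the local slope toward zero and hence the fit toward a local constant (Nadaraya-Watson-type) estimator, which stabilizes the variance where a random design leaves few observations near $x$.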