Penalized likelihood methods provide a range of practical modelling tools, including spline smoothing, generalized additive models and variants of ridge regression. Selecting the correct weights for penalties is a critical part of using these methods, and in the single-penalty case the analyst has several well-founded techniques to choose from. However, many modelling problems suggest a formulation employing multiple penalties, and here general methodology is lacking. A wide family of models with multiple penalties can be fitted to data by iterative solution of the generalized ridge regression problem

    minimize  ‖W^{1/2}(Xp − y)‖² ρ + Σ_{i=1}^{m} θ_i p′S_i p

(p is a parameter vector, X a design matrix, S_i a non-negative definite coefficient matrix defining the ith penalty with associated smoothing parameter θ_i, W a diagonal weight matrix, y a vector of data or pseudodata, and ρ an 'overall' smoothing parameter included for computational efficiency). This paper shows how smoothing parameter selection can be performed efficiently by applying generalized cross-validation to this problem, and how this allows non-linear, generalized linear and linear models to be fitted using multiple penalties, substantially increasing the scope of penalized modelling methods. Examples of non-linear modelling, generalized additive modelling and anisotropic smoothing are given.
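The penalized fit and the generalized cross-validation criterion described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's efficient algorithm: the overall parameter ρ is absorbed into the θ_i, the hat matrix is formed explicitly (fine for small problems only), and the function names are illustrative.

```python
import numpy as np

def fit_multi_penalty(X, y, S_list, thetas, w=None):
    """Solve  min_p ||W^{1/2}(Xp - y)||^2 + sum_i theta_i * p' S_i p.

    X: n x q design matrix; S_list: non-negative definite q x q penalty
    matrices; thetas: one smoothing parameter per penalty; w: optional
    diagonal of the weight matrix W (defaults to identity).
    """
    n, q = X.shape
    W = np.eye(n) if w is None else np.diag(w)
    # Normal equations of the penalized least squares problem.
    A = X.T @ W @ X + sum(t * S for t, S in zip(thetas, S_list))
    return np.linalg.solve(A, X.T @ W @ y)

def gcv_score(X, y, S_list, thetas, w=None):
    """GCV score n * RSS / (n - tr(H))^2 for the penalized fit,
    where H is the influence (hat) matrix of the smoother."""
    n = X.shape[0]
    W = np.eye(n) if w is None else np.diag(w)
    A = X.T @ W @ X + sum(t * S for t, S in zip(thetas, S_list))
    H = X @ np.linalg.solve(A, X.T @ W)   # influence matrix
    r = y - H @ y                         # weighted residuals below
    edf = np.trace(H)                     # effective degrees of freedom
    return n * (r @ W @ r) / (n - edf) ** 2
```

In practice one would minimize `gcv_score` over the θ_i jointly (e.g. on a log grid or with a numerical optimizer) rather than fixing them, which is exactly the multi-parameter selection problem the paper addresses.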