This paper presents a formal framework for deriving partial least squares algorithms from statistical hypothesis testing. The new formulation, significance regression (SR), leads to partial least squares for scalar-output problems (PLS1), to a close approximation of a common multivariable partial least squares algorithm (PLS2) under certain model assumptions, and to more general methods under less restrictive model assumptions. For models with multiple outputs, SR will be shown to have certain advantages over PLS2. Using the new formulation, a significance test is advanced for determining the number of directions to be used. The prediction and estimation properties of SR are discussed. A brief numerical example illustrates the relationship between SR and PLS2.
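As an illustrative sketch only (assumed for exposition, not taken from the paper): for centered data X in R^{n x p} and a scalar output y in R^n, the first PLS1 direction maximizes sample covariance with the output, and the hypothesis-testing view underlying SR can be read as testing whether that covariance is significantly nonzero before each additional direction is extracted:

\[
  w_1 \;=\; \arg\max_{\lVert w\rVert = 1} \widehat{\operatorname{Cov}}(Xw,\, y)
        \;=\; \frac{X^{\top} y}{\lVert X^{\top} y\rVert},
  \qquad
  H_0 : \operatorname{Cov}(x, y) = 0 .
\]

Under this reading, a direction is retained only while H_0 is rejected, which is one plausible form of the significance-based stopping rule for choosing the number of directions mentioned above.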