Constrained principal component analysis (CPCA) incorporates external information into principal component analysis (PCA) of a data matrix. CPCA first decomposes the data matrix according to the external information (external analysis), and then applies PCA to the decomposed matrices (internal analysis). The external analysis amounts to projections of the data matrix onto the spaces spanned by matrices of external information, while the internal analysis involves the generalized singular value decomposition (GSVD). Since its original proposal, CPCA has evolved both conceptually and methodologically; it is now founded on firmer mathematical ground, allows a greater variety of decompositions, and includes a wider range of interesting special cases.
s. In this paper we present a comprehensive theory and various extensions o
f CPCA, which were not fully envisioned in the original paper. The new deve
lopments we discuss include least squares (LS) estimation under possibly si
ngular metric matrices, two useful theorems concerning GSVD, decompositions
of data matrices into finer components., and fitting higher-order structur
We also discuss four special cases of CPCA: 1) CCA (canonical correspondence analysis) and CALC (canonical analysis with linear constraints), 2) GMANOVA (generalized MANOVA), 3) Lagrange's theorem, and 4) CANO (canonical correlation analysis) and related methods. We conclude with brief remarks on the advantages and disadvantages of CPCA relative to competing methods.
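To make the two-step procedure concrete, the following is a minimal sketch in Python/NumPy. It assumes the simplest setting, which the abstract does not fix: external information on the rows only, and identity metric matrices $K$ and $L$, so that the GSVD of the internal analysis reduces to the ordinary SVD. The names `cpca`, `Z`, and `G` are illustrative, not from the paper.

```python
# Minimal sketch of the two-step CPCA procedure (assumptions noted above).
import numpy as np

def cpca(Z, G, n_components=2):
    """External analysis: split Z by projection onto span(G);
    internal analysis: PCA (plain SVD, since K = L = I) of each part."""
    # Orthogonal projector onto the column space of G;
    # pinv tolerates a rank-deficient G.
    P = G @ np.linalg.pinv(G.T @ G) @ G.T
    ZG = P @ Z        # part of Z explained by the external information
    ZR = Z - ZG       # residual part, orthogonal to span(G)
    results = {}
    for name, part in (("explained", ZG), ("residual", ZR)):
        U, d, Vt = np.linalg.svd(part, full_matrices=False)
        results[name] = (U[:, :n_components] * d[:n_components],  # scores
                         Vt[:n_components].T)                     # loadings
    return results

rng = np.random.default_rng(0)
Z = rng.standard_normal((100, 6))   # data matrix (cases x variables)
G = rng.standard_normal((100, 3))   # external information about the rows
out = cpca(Z, G)
print(out["explained"][0].shape, out["explained"][1].shape)  # (100, 2) (6, 2)
```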