Let the kp-variate random vector X be partitioned into k subvectors X_i of dimension p each, and let the covariance matrix Psi of X be partitioned analogously into submatrices Psi_ij. The common principal component (CPC) model for dependent random vectors assumes the existence of an orthogonal p by p matrix beta such that beta^t Psi_ij beta is diagonal for all (i, j). After a formal definition of the model, normal theory maximum likelihood estimators are obtained. The asymptotic theory for the estimated orthogonal matrix is derived by a new technique of choosing proper subsets of functionally independent parameters. (C) 2000 Academic Press.