Monotonically convergent algorithms are described for maximizing six (constrained) functions of vectors $x$, or of matrices $X$ with columns $x_1, \ldots, x_r$. These functions are $h_1(x) = \sum_k (x'A_kx)(x'C_kx)^{-1}$, $H_1(X) = \sum_k \operatorname{tr}(X'A_kX)(X'C_kX)^{-1}$, $\tilde{h}_1(X) = \sum_k \sum_l (x_l'A_kx_l)(x_l'C_kx_l)^{-1}$ with $X$ constrained to be columnwise orthonormal, $h_2(x) = \sum_k (x'A_kx)^2(x'C_kx)^{-1}$ subject to $x'x = 1$, $H_2(X) = \sum_k \operatorname{tr}(X'A_kX)(X'A_kX)'(X'C_kX)^{-1}$ subject to $X'X = I$, and $\tilde{h}_2(X) = \sum_k \sum_l (x_l'A_kx_l)^2(x_l'C_kx_l)^{-1}$ subject to $X'X = I$. In these functions the matrices $C_k$ are assumed to be positive definite. The matrices $A_k$ can be arbitrary square matrices.
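As an illustration only (not the algorithms described in the paper), the following sketch evaluates $h_1$, $H_1$, and $h_2$ with NumPy on synthetic data; the dimensions n, r, K and the random construction of arbitrary $A_k$ and positive definite $C_k$ are assumptions made purely for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data (assumed sizes): K arbitrary square A_k and
# K positive definite C_k, all of order n.
n, r, K = 5, 2, 3
A = [rng.standard_normal((n, n)) for _ in range(K)]
C = []
for _ in range(K):
    B = rng.standard_normal((n, n))
    C.append(B @ B.T + n * np.eye(n))  # positive definite by construction

def h1(x):
    """h_1(x) = sum_k (x'A_k x)(x'C_k x)^{-1}."""
    return sum((x @ Ak @ x) / (x @ Ck @ x) for Ak, Ck in zip(A, C))

def H1(X):
    """H_1(X) = sum_k tr[(X'A_k X)(X'C_k X)^{-1}]."""
    return sum(np.trace((X.T @ Ak @ X) @ np.linalg.inv(X.T @ Ck @ X))
               for Ak, Ck in zip(A, C))

def h2(x):
    """h_2(x) = sum_k (x'A_k x)^2 (x'C_k x)^{-1}, for x with x'x = 1."""
    return sum((x @ Ak @ x) ** 2 / (x @ Ck @ x) for Ak, Ck in zip(A, C))

# Evaluate at a unit vector and at a columnwise orthonormal X.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                            # enforce x'x = 1
X, _ = np.linalg.qr(rng.standard_normal((n, r)))  # enforce X'X = I
print(h1(x), H1(X), h2(x))
```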
The general formulation of the functions and the algorithms allows the algorithms to be applied to various problems that arise in multivariate analysis. Several applications of the general algorithms are given. Specifically, algorithms are given for reciprocal principal components analysis, binormamin rotation, generalized discriminant analysis, variants of generalized principal components analysis, simple structure rotation for one of the latter variants, and set component analysis. For most of these methods the algorithms appear to be new; for the others, the existing algorithms turn out to be special cases of the newly derived general algorithms.