We show that different theories recently proposed for independent component analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra [1] and Cardoso [2] showed that the infomax approach of Bell and Sejnowski [3] and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties, and therefore, all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe [4] have shown that the nonlinear principal component analysis (PCA) algorithm of Karhunen and Joutsensalo [5] and Oja [6] can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants, and therefore, approximately minimizes the mutual information [7]. Lambert [8] has proposed different Bussgang cost functions for multichannel blind deconvolution. We show how the Bussgang property relates to the infomax principle. Finally, we discuss convergence and stability as well as future research issues in blind source separation. © 2000 Elsevier Science Ltd. All rights reserved.
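The shared learning rule referred to above is, in its natural-gradient form, ΔW ∝ (I − φ(u)uᵀ)W with u = Wx and a fixed score nonlinearity φ. The following is an illustrative sketch only, not the authors' implementation: it assumes two unit-scale Laplacian (super-Gaussian) sources and the common choice φ(u) = tanh(u), which is appropriate for super-Gaussian data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two super-Gaussian (Laplacian) sources, linearly mixed: x = A s
n, T = 2, 5000
S = rng.laplace(size=(n, T))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Natural-gradient infomax / maximum-likelihood update with a fixed
# tanh score function:  W <- W + eta * (I - tanh(u) u^T / T) W
W = np.eye(n)
eta = 0.1
for _ in range(300):
    U = W @ X
    W += eta * (np.eye(n) - np.tanh(U) @ U.T / T) @ W

# Check separation: each recovered component should correlate strongly
# with exactly one true source (up to permutation and sign).
U = W @ X
C = np.abs(np.corrcoef(np.vstack([U, S]))[:n, n:])
print(C.max(axis=1))
```

Under these assumptions the recovered components match the true sources up to the usual permutation and sign ambiguities; the fixed point of the update is E[φ(u)uᵀ] = I, which holds when the outputs are independent.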