A unifying information-theoretic framework for independent component analysis

Citation
T.W. Lee et al., A unifying information-theoretic framework for independent component analysis, COMPUT MATH, 39(11), 2000, pp. 1-21
Number of citations
85
Subject categories
Computer Science & Engineering
Journal title
COMPUTERS & MATHEMATICS WITH APPLICATIONS
ISSN journal
0898-1221
Volume
39
Issue
11
Year of publication
2000
Pages
1 - 21
Database
ISI
SICI code
0898-1221(200006)39:11<1:AUIFFI>2.0.ZU;2-R
Abstract
We show that different theories recently proposed for independent component analysis (ICA) lead to the same iterative learning algorithm for blind separation of mixed independent sources. We review those theories and suggest that information theory can be used to unify several lines of research. Pearlmutter and Parra [1] and Cardoso [2] showed that the infomax approach of Bell and Sejnowski [3] and the maximum likelihood estimation approach are equivalent. We show that negentropy maximization also has equivalent properties, and therefore, all three approaches yield the same learning rule for a fixed nonlinearity. Girolami and Fyfe [4] have shown that the nonlinear principal component analysis (PCA) algorithm of Karhunen and Joutsensalo [5] and Oja [6] can also be viewed from information-theoretic principles since it minimizes the sum of squares of the fourth-order marginal cumulants, and therefore, approximately minimizes the mutual information [7]. Lambert [8] has proposed different Bussgang cost functions for multichannel blind deconvolution. We show how the Bussgang property relates to the infomax principle. Finally, we discuss convergence and stability as well as future research issues in blind source separation. (C) 2000 Elsevier Science Ltd. All rights reserved.
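The shared learning rule the abstract refers to is, in its widely used natural-gradient form, dW ∝ (I − φ(u)uᵀ)W with u = Wx and a fixed nonlinearity φ. The following is a minimal NumPy sketch of that rule on a toy two-source problem; the mixing matrix, learning rate, and iteration count are illustrative choices, not values from the paper, and φ(u) = tanh(u) is one standard choice for super-Gaussian sources.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # hypothetical mixing matrix
X = A @ S                           # observed mixtures

# Natural-gradient infomax rule:  dW = (I - phi(u) u^T) W,
# with u = W x and phi(u) = tanh(u).
W = np.eye(2)
lr = 0.05
for _ in range(500):
    U = W @ X
    W += lr * (np.eye(2) - (np.tanh(U) @ U.T) / n) @ W

U = W @ X  # recovered sources (up to scaling and permutation)
```

Because ICA is blind, the recovered components match the true sources only up to scale and permutation, which is why quality is usually judged by correlating each output with each source rather than comparing W to A⁻¹ directly.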