From neural learning to independent components

Authors
Oja E.
Citation
E. Oja, From neural learning to independent components, NEUROCOMPUT, 22(1-3), 1998, pp. 187-199
Citations number
30
Subject Categories
AI, Robotics and Automatic Control
Journal title
NEUROCOMPUTING
ISSN journal
0925-2312
Volume
22
Issue
1-3
Year of publication
1998
Pages
187-199
Database
ISI
SICI code
0925-2312(199811)22:1-3<187:FNLTIC>2.0.ZU;2-P
Abstract
Independent component analysis (ICA) is a signal processing technique in which a set of random variables is represented in terms of a set of underlying independent component variables. The most central application is blind source separation for time-domain signals. Most approaches to the ICA problem start from information-theoretic criteria like maximum likelihood or maximum entropy, and result in numerical on-line learning algorithms. We emphasize here the connection of ICA to neural learning, especially constrained Hebbian learning rules that are nonlinear extensions of the principal component analysis learning rules introduced by the author. We review results showing that the nonlinearities in the learning rules are not critical but can be chosen so that the learning rules not only produce independent components, but also have other desirable properties like robustness or fast convergence, in contrast to the often-used polynomial functions ensuing from cumulant expansions. Fast batch versions of the learning rules have also been developed. Some results are given on the stationary points and their asymptotic stability. It is pointed out that sigmoid-shaped nonlinear functions are a good choice from several points of view. (C) 1998 Elsevier Science B.V. All rights reserved.
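
To illustrate the ideas summarized above, the following is a minimal NumPy sketch of a fixed-point (batch) one-unit learning rule with a sigmoid-shaped nonlinearity (tanh), in the spirit of the fast batch versions the abstract mentions. The toy mixing data, the whitening step, and all parameter choices are illustrative assumptions and are not taken from the paper itself.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two independent unit-variance sources, mixed linearly.
n = 2000
s = np.vstack([
    np.sign(rng.standard_normal(n)) * rng.exponential(1.0, n),  # super-Gaussian source
    rng.uniform(-np.sqrt(3), np.sqrt(3), n),                    # sub-Gaussian source
])
A = rng.standard_normal((2, 2))   # unknown mixing matrix
x = A @ s                         # observed mixtures

# Whitening: zero mean, identity covariance (standard ICA preprocessing).
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E @ np.diag(d ** -0.5) @ E.T) @ x

def one_unit(z, iters=200, tol=1e-8):
    """Fixed-point iteration for one independent component, g = tanh."""
    w = rng.standard_normal(z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        wz = w @ z
        g, gp = np.tanh(wz), 1.0 - np.tanh(wz) ** 2
        w_new = (z * g).mean(axis=1) - gp.mean() * w   # E{z g(w'z)} - E{g'(w'z)} w
        w_new /= np.linalg.norm(w_new)                 # renormalization (constraint step)
        if abs(abs(w_new @ w) - 1.0) < tol:            # converged up to sign
            return w_new
        w = w_new
    return w

w = one_unit(z)
# The recovered component should correlate strongly with one source (up to sign).
print("correlations with sources:", np.corrcoef(w @ z, s)[0, 1:])

The tanh nonlinearity here stands in for the sigmoid-shaped functions the abstract recommends; a polynomial (cubic) nonlinearity would also converge but is less robust to outliers, which is the contrast the abstract draws with cumulant-based choices.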