Independent component analysis (ICA) is a signal processing technique in which a set of random variables is represented in terms of a set of underlying independent component variables. The most central application is blind source separation for time-domain signals. Most approaches to the ICA problem start from information-theoretic criteria such as maximum likelihood or maximum entropy, and they result in numerical on-line learning algorithms. We emphasize here the connection of ICA to neural learning, especially to constrained Hebbian learning rules that are nonlinear extensions of the principal component analysis learning rules introduced by the author. We review results showing that the nonlinearities in the learning rules are not critical: they can be chosen so that the learning rules not only produce independent components but also have other desirable properties, such as robustness or fast convergence, in contrast to the often-used polynomial functions ensuing from cumulant expansions. Fast batch versions of the learning rules have also been developed. Some results are given on the stationary points and their asymptotic stability. It is pointed out that sigmoid-shaped nonlinear functions are a good choice from several points of view.
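To make the constrained Hebbian rules concrete, the following is a minimal sketch of a one-unit rule of the form w <- w + mu * z * g(w^T z), followed by renormalization to ||w|| = 1, using the sigmoid-shaped nonlinearity g(y) = tanh(y) on whitened data. The toy sources, the mixing matrix, the step size, and all variable names are illustrative assumptions, not taken from the paper; the rule form is only a sketch of the family of one-unit nonlinear Hebbian rules discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy blind source separation setup (sources, mixing matrix, and sizes
# are illustrative assumptions): two independent non-Gaussian sources.
n = 10_000
s = np.vstack([
    rng.laplace(scale=1 / np.sqrt(2), size=n),  # super-Gaussian, unit variance
    rng.choice([-1.0, 1.0], size=n),            # sub-Gaussian (binary) source
])
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                      # assumed mixing matrix
x = A @ s                                       # observed mixtures

# Whiten the mixtures so that E[z z^T] = I; after whitening, the
# remaining unmixing transform is orthogonal.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
V = E @ np.diag(d ** -0.5) @ E.T                # whitening matrix
z = V @ x

# One-unit constrained Hebbian rule with g(y) = tanh(y):
#     w <- w + mu * g(w^T z_t) * z_t,  then renormalize ||w|| = 1.
# With this ascent direction and g = tanh, the rule tends to converge
# (up to sign) to a super-Gaussian source such as the Laplace one here;
# the sign of the update would be flipped for sub-Gaussian sources.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
mu = 0.01
for t in range(n):
    y = w @ z[:, t]
    w += mu * np.tanh(y) * z[:, t]
    w /= np.linalg.norm(w)

y_est = w @ z   # estimated independent component (sign/scale ambiguous)
```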
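The fast batch versions mentioned above are, presumably, fixed-point algorithms of the FastICA type; below is a minimal sketch of the well-known one-unit fixed-point update w <- E[z g(w^T z)] - E[g'(w^T z)] w, followed by renormalization, again with g = tanh. It reuses the whitened matrix z from the previous sketch; the function name, tolerance, and iteration cap are my own choices, not the paper's.

```python
import numpy as np

def one_unit_fixed_point(z, n_iter=200, tol=1e-10, seed=1):
    # z: whitened observations of shape (dim, n_samples), E[z z^T] = I.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        gy = np.tanh(w @ z)
        # Fixed-point step: w <- E[z g(w^T z)] - E[g'(w^T z)] w,
        # with g'(y) = 1 - tanh(y)^2.
        w_new = (z * gy).mean(axis=1) - (1.0 - gy ** 2).mean() * w
        w_new /= np.linalg.norm(w_new)
        # Compare directions up to sign, since independent components
        # are recovered only up to sign and permutation.
        if 1.0 - abs(w_new @ w) < tol:
            return w_new
        w = w_new
    return w

# Applied to the whitened z from the previous sketch:
# w_fp = one_unit_fixed_point(z)
```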