We consider the estimation of the data model of independent component analysis when Gaussian noise is present. We show that the joint maximum likelihood estimation of the independent components and the mixing matrix leads to an objective function already proposed by Olshausen and Field using a different derivation. Due to the complicated nature of the objective function, we introduce approximations that greatly simplify the optimization problem. We show that the presence of noise implies that the relation between the observed data and the estimates of the independent components is non-linear, and show how to approximate this non-linearity. In particular, the non-linearity may be approximated by a simple shrinkage operation in the case of super-Gaussian (sparse) data. Using these approximations, we propose an efficient algorithm for approximate maximization of the likelihood. In the case of super-Gaussian components, this may be approximated by simple competitive learning, and in the case of sub-Gaussian components, by anti-competitive learning. (C) 1998 Elsevier Science B.V. All rights reserved.
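The shrinkage operation mentioned for super-Gaussian (sparse) data can be illustrated with soft thresholding, a standard shrinkage function; this is only a sketch of the general idea, not the specific non-linearity derived in the paper, which depends on the assumed source density. The threshold value and noise levels below are illustrative assumptions.

```python
import numpy as np

def shrink(u, t):
    """Soft-thresholding shrinkage: values with |u| <= t are set to zero,
    larger values are pulled toward zero by t."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

# Hypothetical demo: a sparse (super-Gaussian) source observed in Gaussian noise.
rng = np.random.default_rng(0)
s = rng.laplace(scale=1.0, size=1000) * (rng.random(1000) < 0.1)  # sparse source
x = s + rng.normal(scale=0.2, size=1000)                          # noisy observation
s_hat = shrink(x, 0.4)  # denoised estimate: small (mostly-noise) values vanish
```

Small observations are treated as pure noise and zeroed out, while large observations are kept but shrunk toward zero, which is why this kind of non-linear estimator suits sparse sources.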