NONLINEAR FEEDFORWARD NETWORKS WITH STOCHASTIC OUTPUTS - INFOMAX IMPLIES REDUNDANCY REDUCTION

Citation
J.-P. Nadal et al., NONLINEAR FEEDFORWARD NETWORKS WITH STOCHASTIC OUTPUTS - INFOMAX IMPLIES REDUNDANCY REDUCTION, Network, 9(2), 1998, pp. 207-217
Citations number
17
Categorie Soggetti
Computer Science, Artificial Intelligence; Neurosciences; Engineering, Electrical & Electronic
Journal title
Network
ISSN journal
0954898X
Volume
9
Issue
2
Year of publication
1998
Pages
207 - 217
Database
ISI
SICI code
0954-898X(1998)9:2<207:NFNWSO>2.0.ZU;2-O
Abstract
We prove that maximization of mutual information between the output and the input of a feedforward neural network leads to full redundancy reduction under the following sufficient conditions: (i) the input signal is a (possibly nonlinear) invertible mixture of independent components; (ii) there is no input noise; (iii) the activity of each output neuron is a (possibly) stochastic variable with a probability distribution depending on the stimulus through a deterministic function of the inputs (where both the probability distributions and the functions can be different from neuron to neuron); (iv) optimization of the mutual information is performed over all these deterministic functions. This result extends that obtained by Nadal and Parga (1994), who considered the case of deterministic outputs.
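The infomax-implies-redundancy-reduction result can be illustrated in the simplest setting the abstract covers: a linear invertible, noiseless mixture (conditions (i)-(ii)) with deterministic sigmoidal output functions. The sketch below is illustrative, not taken from the paper; it uses the well-known Bell-Sejnowski natural-gradient infomax rule, which maximizes the output entropy (equivalently, the input-output mutual information in the noiseless case) and, as the theorem predicts, drives the outputs toward statistically independent components.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: two independent super-Gaussian sources, linearly mixed
# with an invertible matrix and no input noise.
n = 5000
s = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # invertible mixing matrix
x = A @ s

# Whiten the inputs so the remaining unmixing step is a rotation.
cov = np.cov(x)
d, E = np.linalg.eigh(cov)
V = E @ np.diag(d ** -0.5) @ E.T
z = V @ x

# Bell-Sejnowski infomax with the natural-gradient update:
# maximizing I(output; input) for sigmoidal output functions.
W = np.eye(2)
lr = 0.1
for _ in range(300):
    u = W @ z
    y = 1.0 / (1.0 + np.exp(-u))    # deterministic output nonlinearity
    W += lr * (np.eye(2) + (1 - 2 * y) @ u.T / n) @ W

# If redundancy is fully removed, the end-to-end map inverts the
# mixture up to permutation and scaling of the components.
P = W @ V @ A
```

After training, each row of `P` should have a single dominant entry, meaning each output recovers one independent component; the learning rate, sample size, and mixing matrix here are arbitrary choices for the demonstration.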