We consider a linear, one-layer feedforward neural network performing a coding task. The goal of the network is to provide a statistical neural representation that conveys as much information as possible about the input stimuli in noisy conditions. We determine the family of synaptic couplings that maximizes the mutual information between the input and output distributions. Optimization is performed under different constraints on the synaptic efficacies. We analyse the dependence of the solutions on the input and output noises. This work goes beyond previous studies of the same problem in that: (i) we perform a detailed stability analysis in order to find the global maxima of the mutual information; (ii) we examine the properties of the optimal synaptic configurations under different constraints; (iii) we do not assume translational invariance of the input data, as is usually done when the inputs are assumed to be visual stimuli.
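The objective described above can be sketched for the standard linear-Gaussian setting: with Gaussian inputs and Gaussian input and output noise, the mutual information between stimulus and response has a closed form in terms of log-determinants. The function name, the coupling matrix `W`, and the noise variances below are illustrative assumptions, not the paper's notation; this is a minimal sketch of the quantity being maximized, not the optimization itself.

```python
import numpy as np

def mutual_information(W, Sigma_x, var_in, var_out):
    """I(X; Y) in nats for Y = W (X + n_in) + n_out.

    X ~ N(0, Sigma_x); n_in, n_out are isotropic Gaussian noises with
    variances var_in and var_out. For jointly Gaussian variables,
    I(X; Y) = 0.5 * (logdet Cov(Y) - logdet Cov(Y | X)).
    """
    n_in_dim = W.shape[1]
    n_out_dim = W.shape[0]
    # Total output covariance: signal plus both noise sources.
    cov_y = W @ (Sigma_x + var_in * np.eye(n_in_dim)) @ W.T \
        + var_out * np.eye(n_out_dim)
    # Conditional covariance given X: only the noise terms remain.
    cov_y_given_x = var_in * (W @ W.T) + var_out * np.eye(n_out_dim)
    _, logdet_y = np.linalg.slogdet(cov_y)
    _, logdet_y_x = np.linalg.slogdet(cov_y_given_x)
    return 0.5 * (logdet_y - logdet_y_x)

# Example: information decreases as the output noise grows.
W = np.array([[1.0, 0.2],
              [0.0, 1.0]])
Sigma_x = np.eye(2)
i_low_noise = mutual_information(W, Sigma_x, 0.1, 0.1)
i_high_noise = mutual_information(W, Sigma_x, 0.1, 1.0)
```

Under a constraint on the synaptic efficacies (e.g. a bound on the norm of `W`), the infomax solutions the abstract refers to are the couplings maximizing this quantity.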