We examine the performance of Hebbian-like attractor neural networks,
recalling stored memory patterns from their distorted versions. Searching
for an activation (firing-rate) function that maximizes the performance
in sparsely connected low-activity networks, we show that the optimal
activation function is a threshold-sigmoid of the neuron's input
field. This function is shown to be in close correspondence with the
dependence of the firing rate of cortical neurons on their integrated
input current, as described by neurophysiological recordings and
conduction-based models. It also accounts for the decreasing-density shape
of firing rates that has been reported in the literature.
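As a rough illustration of the kind of activation function the abstract describes, the sketch below parameterizes a threshold-sigmoid of the input field h: zero below a threshold, rising and saturating above it. The specific functional form and the parameter names (theta, gain, f_max) are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def threshold_sigmoid(h, theta=0.0, gain=1.0, f_max=1.0):
    """One plausible threshold-sigmoid firing-rate function (illustrative):
    silent below the threshold theta, then a saturating sigmoid rise
    toward the maximal rate f_max as the input field h grows."""
    h = np.asarray(h, dtype=float)
    return np.where(h > theta, f_max * np.tanh(gain * (h - theta)), 0.0)

# Example: subthreshold inputs give zero rate; strong inputs saturate near f_max.
rates = threshold_sigmoid(np.array([-1.0, 0.0, 5.0]), theta=0.0, gain=2.0)
```

Any monotone saturating function (e.g. a logistic in place of tanh) would serve the same illustrative purpose; the qualitative shape, a hard threshold followed by a sigmoidal rise, is what matters for the correspondence with cortical firing-rate data discussed in the abstract.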