The limiter function is used in many learning and retrieval models as the constraint controlling the magnitude of the weight or state vectors. In this paper, we developed a new method to relate the set of saturated fixed points to the set of system parameters of the models that use the limiter function, and then, as a case study, applied this method to Linsker's Hebbian learning network. We derived a necessary and sufficient condition to test whether a given saturated weight or state vector is stable for any given set of system parameters, and used this condition to determine the entire regime in the parameter space over which the given state is stable. This approach allows us to investigate the relative stability of the major receptive fields reported in Linsker's simulations, and to demonstrate the crucial role played by the synaptic density functions.
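As a minimal sketch of the kind of constraint the abstract refers to (not the paper's exact formulation; the bounds, learning rate, and update rule here are illustrative assumptions), a limiter function simply clips each weight to a fixed interval after a plain Hebbian increment, so that unbounded Hebbian growth is replaced by saturation at the bounds:

```python
import numpy as np

def limiter(w, w_min=-1.0, w_max=1.0):
    """Hard-limit each weight to the interval [w_min, w_max]."""
    return np.clip(w, w_min, w_max)

def hebbian_step(w, x, lr=0.01, w_min=-1.0, w_max=1.0):
    """One Hebbian update on a linear unit, followed by the limiter."""
    y = w @ x                        # linear unit output
    w = w + lr * y * x               # plain Hebbian increment (would diverge alone)
    return limiter(w, w_min, w_max)  # constraint keeping weights bounded

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=8)    # small random initial weights
for _ in range(500):
    x = rng.normal(size=8)           # i.i.d. input sample (illustrative)
    w = hebbian_step(w, x)

# With the limiter, every weight stays inside [-1, 1]; many typically
# end up saturated at a bound, which are the "saturated fixed points"
# whose stability the paper analyzes.
print(np.all(np.abs(w) <= 1.0))
```

A saturated weight vector in this sense is one whose components all sit at `w_min` or `w_max`; the paper's condition determines, for given system parameters, whether such a vector is a stable fixed point of the constrained dynamics.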