This paper considers the problem of characterizing the stability properties of the equilibria of an important class of recurrent neural networks. Sufficient conditions are given under which the neural network possesses a unique globally asymptotically stable equilibrium point for each external input. These conditions are less restrictive than those previously obtained and are easily checked, so that incorporating them into existing neural network design procedures should increase the flexibility and reduce the complexity of this synthesis process. Results are provided for both continuous-time and discrete-time networks.
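As an aside intended purely for intuition (and not the conditions established in this paper), the flavor of such easily checked criteria can be illustrated by the classical contraction argument for a generic discrete-time additive network; the model form and the norm bound below are assumptions made only for this illustration. Consider
\[
  x_{k+1} = \sigma(W x_k + u),
\]
where \(\sigma\) acts componentwise and is 1-Lipschitz (e.g., \(\tanh\)), \(W\) is the weight matrix, and \(u\) is a constant external input. If \(\|W\| < 1\) in an induced matrix norm such as the spectral norm, then the map \(F(x) = \sigma(W x + u)\) satisfies
\[
  \|F(x) - F(y)\| \le \|W\|\,\|x - y\|,
\]
so \(F\) is a contraction; by the Banach fixed-point theorem the network then has, for every input \(u\), a unique equilibrium \(x^{*}(u)\) to which all trajectories converge, i.e., a unique globally asymptotically stable equilibrium point.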