Symmetrically connected recurrent networks have recently been used as models of a host of neural computations. However, biological neural networks have asymmetrical connections, at the very least because of the separation between excitatory and inhibitory neurons in the brain. We study characteristic differences between asymmetrical networks and their symmetrical counterparts in cases for which they act as selective amplifiers for particular classes of input patterns. We show that the dramatically different dynamical behaviours to which they have access often make the asymmetrical networks computationally superior. We illustrate our results in networks that selectively amplify oriented bars and smooth contours in visual inputs.