Synapses play a central role in neural computation: the strengths of synaptic connections determine the function of a neural circuit. In conventional models of computation, synaptic strength is assumed to be a static quantity that changes only on the slow timescale of learning. In biological systems, however, synaptic strength undergoes dynamic modulation on rapid timescales through mechanisms such as short-term facilitation and depression. Here we describe a general model of computation that exploits dynamic synapses, and use a backpropagation-like algorithm to adjust the synaptic parameters. We show that such gradient descent suffices to approximate a given quadratic filter by a rather small neural system with dynamic synapses. We also compare our network model to artificial neural networks designed for time series processing. Our numerical results are complemented by theoretical analyses which show that even with just a single hidden layer such networks can approximate a surprisingly large class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics.
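For concreteness, one common way to write the dynamics of a facilitating and depressing synapse is the standard Tsodyks-Markram-style update (this is an assumption for illustration; the paper's own synapse model and parameterization may differ). With inter-spike interval $\Delta_k$ before the $k$-th presynaptic spike, the synaptic efficacy can be taken as

\[
A_k = w\, u_k\, R_k,
\qquad
u_k = U + u_{k-1}(1-U)\, e^{-\Delta_k/\tau_F},
\qquad
R_k = 1 - \bigl(1 - R_{k-1}(1 - u_{k-1})\bigr)\, e^{-\Delta_k/\tau_D},
\]

where $w$ is a static weight, $U$ sets the facilitation increment, and $\tau_F$, $\tau_D$ are the facilitation and recovery time constants (conventions differ on whether $u_{k-1}$ or $u_k$ enters the release term). Under this kind of parameterization, a backpropagation-like procedure would adjust parameters such as $(w, U, \tau_F, \tau_D)$ per synapse by gradient descent on the filter-approximation error.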