Continuous-valued recurrent neural networks can learn mechanisms for processing context-free languages. The dynamics of such networks is usually based on damped oscillation around fixed points in state space and requires that the dynamical components are arranged in certain ways. It is shown that qualitatively similar dynamics with similar constraints hold for a^n b^n c^n, a context-sensitive language. The additional difficulty with a^n b^n c^n, compared with the context-free language a^n b^n, consists of 'counting up' and 'counting down' letters simultaneously. The network solution is to oscillate in two principal dimensions, one for counting up and one for counting down.
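As a toy illustration of this two-dimensional counting scheme (a hand-built sketch, not the network learned in the study; the rate LAM, the tolerance eps and the explicit a*b*c* order check are illustrative assumptions), each dimension can be modelled as a scalar that is contracted towards a fixed point at zero while counting up and expanded away from it while counting down, with a negative rate supplying the oscillation:

```python
import re

# Counting by oscillation in two dimensions: x counts up the a's and is
# counted down during the b's, while y simultaneously counts up the b's
# and is counted down during the c's.  The negative rate LAM makes each
# counter oscillate around the fixed point at 0.
LAM = -0.6   # |LAM| < 1: multiplying contracts toward 0, dividing expands

def accepts(s, eps=1e-6):
    if not re.fullmatch(r'a*b*c*', s):   # counting alone ignores letter
        return False                     # order, so check the shape first
    x = y = 1.0
    for ch in s:
        if ch == 'a':
            x *= LAM                     # count up the a's in x
        elif ch == 'b':
            x /= LAM                     # count down in x ...
            y *= LAM                     # ... while counting up in y
        else:
            y /= LAM                     # count down the b's in y
    # both counters return to their start state iff #a == #b == #c
    return abs(x - 1.0) < eps and abs(y - 1.0) < eps

for s in ("abc", "aabbcc", "aaabbbccc", "aabbc", "aabbbcc"):
    print(s, accepts(s))   # True for the first three only
```

Because multiplication and division by LAM are inverses, each counter returns to its starting value exactly when the corresponding letters balance, so the final-state test recovers n without any discrete counter.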
This study focuses on the dynamics employed by the sequential cascaded network, in contrast to the simple recurrent network, and on the use of backpropagation through time. The solutions found generalize well beyond the training data; however, learning is not reliable. The contribution of this study lies in demonstrating how the dynamics in recurrent neural networks that process context-free languages can also be employed in processing some context-sensitive languages (traditionally thought of as requiring additional computational resources). This continuity of mechanism between language classes contributes to our understanding of neural networks in modelling language learning and processing.
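For concreteness, the following is a minimal sketch of one step of a sequential cascaded (second-order) network, assuming the standard Pollack-style formulation in which a context network maps the previous state to the weights of a function network; the tensor shape, layer sizes and random weights are illustrative assumptions, not the trained network from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_in = 4, 3   # illustrative sizes, not the paper's architecture

# Weight tensor of the second-order recurrence; the trailing +1 slots act
# as bias units for the state and the input respectively.
W = rng.normal(scale=0.5, size=(n_state, n_state + 1, n_in + 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def scn_step(state, x):
    s = np.append(state, 1.0)               # previous state plus bias
    xi = np.append(x, 1.0)                   # current input plus bias
    weights = np.einsum('ijk,j->ik', W, s)   # context net: state -> function-net weights
    return sigmoid(weights @ xi)             # function net: input -> next state

# Drive the network over a one-hot-encoded string from {a, b, c}.
onehot = {'a': [1.0, 0.0, 0.0], 'b': [0.0, 1.0, 0.0], 'c': [0.0, 0.0, 1.0]}
state = np.zeros(n_state)
for ch in "aabbcc":
    state = scn_step(state, np.array(onehot[ch]))
print(state)   # final state; trained weights would make this encode the counts
```

Unlike a simple recurrent network, where the previous state enters additively, the multiplicative interaction lets the state reconfigure the input-to-state map at every step; training such weights with backpropagation through time is what the study relies on to shape the oscillating counters.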