It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement a Turing machine by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that work to show a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information like an explicit storage mechanism. In other cases, it stores information more indirectly, in trajectories that are sensitive to slight displacements that depend on context. In this sense, an SRN can learn analog computation as a set of interdependent counters. This demonstrates how SRNs may serve as an alternative psychological model of language or sequence processing.
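To make the counting mechanism concrete, the sketch below hand-wires a single linear recurrent unit whose activation increments on 'a' and decrements on 'b', so a string from a^n b^n traces a trajectory that counts up and then back down to zero. This is only an illustrative assumption of how an analog counter can live in a recurrent state; the weights (W_in, W_rec), the symbol encoding, and the function names are hypothetical and are not the learned solution analyzed in the article.

    import numpy as np

    # Hypothetical hand-set weights (not learned): one recurrent unit whose
    # activation serves as an analog counter over the alphabet {a, b}.
    W_in = np.array([[1.0, -1.0]])   # 'a' adds +1 to the count, 'b' adds -1
    W_rec = np.array([[1.0]])        # identity recurrence carries the count forward

    def encode(symbol):
        # One-hot encoding of the two input symbols
        return np.array([1.0, 0.0]) if symbol == 'a' else np.array([0.0, 1.0])

    def run_counter(string):
        h = np.zeros(1)              # recurrent state starts at zero
        trajectory = []
        for symbol in string:
            h = W_rec @ h + W_in @ encode(symbol)   # linear update: count up or down
            trajectory.append(h[0])
        return trajectory

    print(run_counter("aaabbb"))     # [1.0, 2.0, 3.0, 2.0, 1.0, 0.0]

In a trained SRN the counting is distributed over nonlinear hidden units rather than a single hand-set weight, but the same idea applies: the state moves along a trajectory whose position encodes how many symbols remain to be predicted.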