Recurrent neural networks can simulate any finite state automaton as well as any multi-stack Turing machine. When the network architecture is constrained, however, this computational power may no longer hold; for example, recurrent cascade-correlation cannot simulate any finite state automaton. It is therefore important to assess the computational power of a given network architecture, since this characterizes the class of functions which, in principle, can be computed by it. We discuss the computational power of neural networks for structures. Elman-style networks, cascade-correlation networks and neural trees for structures are introduced. We show that Elman-style networks can simulate any frontier-to-root tree automaton, while neither cascade-correlation networks nor neural trees can. As a special case of the latter result, we obtain that neural trees for sequences cannot simulate any finite state machine. (C) 1997 Elsevier Science Ltd.