ON THE COMPUTATIONAL POWER OF RECURRENT NEURAL NETWORKS FOR STRUCTURES

Authors
A. Sperduti
Citation
A. Sperduti, ON THE COMPUTATIONAL POWER OF RECURRENT NEURAL NETWORKS FOR STRUCTURES, Neural networks, 10(3), 1997, pp. 395-400
Citations number
20
Subject Categories
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
0893-6080
Volume
10
Issue
3
Year of publication
1997
Pages
395 - 400
Database
ISI
SICI code
0893-6080(1997)10:3<395:OTCPOR>2.0.ZU;2-N
Abstract
Recurrent neural networks can simulate any finite state automata as we ll as any multi-stack Turing machine. When constraining the network ar chitecture, however, this computational power may no longer hold. For example, recurrent cascade-correlation cannot simulate any finite stat e automata. Thus, it is important to assess the computational power of a given network architecture, since this characterizes the class of f unctions which, in principle, can be computed by it. We discuss the co mputational power of neural networks for structures. Elman-style netwo rks, cascade-correlation networks and neural trees for structures are introduced We show that Elman-style networks can simulate any frontier -to-root tree automation while neither cascade-correlation networks no r neural trees can. As a special case of the latter result, we obtain that neural trees for sequences cannot simulate any finite state machi ne. (C) 1997 Elsevier Science Ltd.