RECURRENT NEURAL NETWORKS AND FINITE AUTOMATA

Authors
Siegelmann H.T.
Citation
H.T. Siegelmann, RECURRENT NEURAL NETWORKS AND FINITE AUTOMATA, Computational Intelligence, 12(4), 1996, pp. 567-574
Citations number
17
Subject Categories
Computer Sciences, Special Topics; Computer Science, Artificial Intelligence
Journal title
Computational Intelligence
ISSN journal
0824-7935
Volume
12
Issue
4
Year of publication
1996
Pages
567 - 574
Database
ISI
SICI code
0824-7935(1996)12:4<567:RNNAFA>2.0.ZU;2-S
Abstract
This article studies finite-size networks that consist of interconnections of synchronously evolving processors: each processor updates its state by applying an activation function to a linear combination of the previous states of all units. We prove that any function for which the left and right limits exist and are different can be applied to the neurons to yield a network which is at least as strong computationally as a finite automaton. We conclude that if this is the power required, one may choose any of the aforementioned neurons, according to the hardware available or the learning software preferred for the particular application.
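The abstract describes networks whose neurons update by applying an activation function to a linear combination of the previous states of all units. The following is a minimal illustrative sketch, not taken from the paper: it uses a hard-threshold activation (one example of a function whose left and right limits at the discontinuity exist and differ) and a toy parity automaton; the weight matrices, the two-tick update per symbol, and all names are assumptions chosen for the example.

```python
# Minimal sketch (illustrative, not the paper's construction): a network of
# threshold neurons simulating a two-state finite automaton.
import numpy as np

def H(z):
    """Hard threshold: left limit 0, right limit 1 at z = 0."""
    return (z > 0).astype(float)

# Toy DFA: states {0, 1}, alphabet {0, 1}, delta(q, a) = q XOR a
# (tracks the parity of the number of 1s read so far).
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
n_states, n_symbols = 2, 2

# Layer 1: one neuron per (state, symbol) pair fires iff the one-hot network
# state is q AND the current one-hot input symbol is a.
# Layer 2: one neuron per state q' fires iff some firing pair maps to q'.
W_pair_state = np.zeros((n_states * n_symbols, n_states))
W_pair_input = np.zeros((n_states * n_symbols, n_symbols))
W_state_pair = np.zeros((n_states, n_states * n_symbols))
for (q, a), q_next in delta.items():
    k = q * n_symbols + a
    W_pair_state[k, q] = 1.0
    W_pair_input[k, a] = 1.0
    W_state_pair[q_next, k] = 1.0

def step(x, a):
    """One DFA transition, realized by two synchronous threshold updates."""
    u = np.eye(n_symbols)[a]
    pairs = H(W_pair_state @ x + W_pair_input @ u - 1.5)  # AND of state and symbol
    return H(W_state_pair @ pairs - 0.5)                  # OR over matching pairs

x = np.eye(n_states)[0]        # start in state 0 (one-hot)
for a in [1, 0, 1, 1]:         # input word 1011
    x = step(x, a)
print(int(np.argmax(x)))       # prints 1: odd number of 1s, matching the DFA
```

Any other activation with distinct one-sided limits could replace H here after rescaling the weights and thresholds, which is the sense in which the choice of neuron can follow the available hardware or learning software.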