Noisy time series prediction using recurrent neural networks and grammatical inference

Citation
C.L. Giles et al., Noisy time series prediction using recurrent neural networks and grammatical inference, MACH LEARN, 44(1-2), 2001, pp. 161-183
Citations number
64
Subject Categories
AI Robotics and Automatic Control
Journal title
MACHINE LEARNING
ISSN journal
0885-6125
Volume
44
Issue
1-2
Year of publication
2001
Pages
161 - 183
Database
ISI
SICI code
0885-6125(2001)44:1-2<161:NTSPUR>2.0.ZU;2-U
Abstract
Financial forecasting is an example of a signal processing problem which is challenging due to small sample sizes, high noise, non-stationarity, and non-linearity. Neural networks have been very successful in a number of signal processing applications. We discuss fundamental limitations and inherent difficulties when using neural networks for the processing of high noise, small sample size signals. We introduce a new intelligent signal processing method which addresses the difficulties. The method proposed uses conversion into a symbolic representation with a self-organizing map, and grammatical inference with recurrent neural networks. We apply the method to the prediction of daily foreign exchange rates, addressing difficulties with non-stationarity, overfitting, and unequal a priori class probabilities, and we find significant predictability in comprehensive experiments covering 5 different foreign exchange rates. The method correctly predicts the direction of change for the next day with an error rate of 47.1%. The error rate reduces to around 40% when rejecting examples where the system has low confidence in its prediction. We show that the symbolic representation aids the extraction of symbolic knowledge from the trained recurrent neural networks in the form of deterministic finite state automata. These automata explain the operation of the system and are often relatively simple. Automata rules related to well known behavior such as trend following and mean reversal are extracted.
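
Illustrative sketch

The following Python sketch is not the authors' implementation; it only illustrates the pipeline the abstract describes: daily returns are quantized into a small symbolic alphabet with a one-dimensional self-organizing map, an Elman-style recurrent network is trained on the symbol sequence to predict the next day's direction of change, and low-confidence predictions are rejected. The synthetic data, PyTorch usage, alphabet size, network sizes, and the confidence threshold are all assumptions made for illustration.

import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# Synthetic stand-in for daily returns (the paper uses foreign exchange rates).
returns = rng.standard_normal(2000) * 0.005

# 1. Symbolic conversion with a tiny one-dimensional self-organizing map.
n_symbols = 3                                  # small, illustrative alphabet
weights = np.linspace(returns.min(), returns.max(), n_symbols)
for t, lr in enumerate(np.linspace(0.5, 0.01, 20 * len(returns))):
    x = returns[t % len(returns)]
    w = int(np.argmin(np.abs(weights - x)))    # best-matching unit
    for j in range(n_symbols):                 # Gaussian neighbourhood, decaying learning rate
        h = np.exp(-((j - w) ** 2) / 2.0)
        weights[j] += lr * h * (x - weights[j])
symbols = np.array([int(np.argmin(np.abs(weights - x))) for x in returns])

# 2. Recurrent network predicting the next day's direction of change.
targets = (returns[1:] > 0).astype(np.float32)  # 1 = up, 0 = down
inputs = symbols[:-1]

class DirectionRNN(nn.Module):
    def __init__(self, n_symbols, hidden=8):
        super().__init__()
        self.embed = nn.Embedding(n_symbols, 4)
        self.rnn = nn.RNN(4, hidden, batch_first=True)   # Elman-style recurrent layer
        self.head = nn.Linear(hidden, 1)

    def forward(self, seq):
        h, _ = self.rnn(self.embed(seq))
        return self.head(h).squeeze(-1)                  # one logit per time step

model = DirectionRNN(n_symbols)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

seq = torch.tensor(inputs, dtype=torch.long).unsqueeze(0)
tgt = torch.tensor(targets).unsqueeze(0)
for epoch in range(50):
    opt.zero_grad()
    loss = loss_fn(model(seq), tgt)
    loss.backward()
    opt.step()

# 3. Reject low-confidence predictions, analogous to the paper's confidence-based rejection.
with torch.no_grad():
    p = torch.sigmoid(model(seq))
    confident = (p - 0.5).abs() > 0.1          # illustrative rejection threshold
    if confident.any():
        acc = ((p > 0.5).float() == tgt)[confident].float().mean()
        print(f"direction accuracy on retained (confident) days: {acc:.3f}")

The paper additionally extracts deterministic finite state automata from the trained recurrent network; that step operates on the network's state space and is not shown here.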