A dynamic regularized radial basis function network for nonlinear, nonstationary time series prediction

Authors
Citation
P. Yee and S. Haykin, A dynamic regularized radial basis function network for nonlinear, nonstationary time series prediction, IEEE Transactions on Signal Processing, 47(9), 1999, pp. 2503-2521
Citations number
40
Subject Category
Electrical & Electronics Engineering
Journal title
IEEE TRANSACTIONS ON SIGNAL PROCESSING
ISSN journal
1053-587X
Volume
47
Issue
9
Year of publication
1999
Pages
2503 - 2521
Database
ISI
SICI code
1053-587X(199909)47:9<2503:ADRRBF>2.0.ZU;2-F
Abstract
In this paper, constructive approximation theorems are given which show that under certain conditions, the standard Nadaraya-Watson regression estimate (NWRE) can be considered a specially regularized form of radial basis function networks (RBFN's). From this and another related result, we deduce that regularized RBFN's are m.s. consistent, like the NWRE, for the one-step-ahead prediction of Markovian nonstationary, nonlinear autoregressive time series generated by i.i.d. noise processes. Additionally, choosing the regularization parameter to be asymptotically optimal gives regularized RBFN's the advantage of asymptotically realizing minimum m.s. prediction error. Two update algorithms (one with augmented networks/infinite memory and the other with fixed-size networks/finite memory) are then proposed to deal with nonstationarity induced by time-varying regression functions. For the latter algorithm, tests on several phonetically balanced male and female speech samples show an average 2.2-dB improvement in the predicted signal/noise (error) ratio over corresponding adaptive linear predictors using the exponentially weighted RLS algorithm. Further RLS filtering of the predictions from an ensemble of three such RBFN's combined with the usual autoregressive inputs increases the improvement to 4.2 dB, on average, over the linear predictors.
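As a rough illustration of the abstract's starting point, the following is a minimal sketch of one-step-ahead Nadaraya-Watson prediction on a scalar time series, using a Gaussian kernel and an order-1 autoregressive embedding. The function name, the bandwidth value, and the toy signal are illustrative assumptions, not details taken from the paper (which treats the regularized-RBFN view and higher-order, nonstationary settings).

```python
import numpy as np

def nwre_predict(history, x_query, h=0.5):
    """One-step-ahead Nadaraya-Watson prediction (order-1 embedding).

    history: 1-D array of past samples x[0], ..., x[n]
    x_query: current sample whose successor we want to predict
    h: kernel bandwidth (a free smoothing parameter)
    """
    # Form (input, target) pairs from consecutive samples:
    # the regression maps x[t] to x[t+1].
    inputs = history[:-1]
    targets = history[1:]
    # Gaussian kernel weights centred on the query point.
    w = np.exp(-0.5 * ((inputs - x_query) / h) ** 2)
    # Kernel-weighted average of the targets (a convex combination).
    return float(np.dot(w, targets) / w.sum())

# Toy usage: predict the final sample of a noisy sine series
# from everything that precedes it.
t = np.linspace(0, 4 * np.pi, 200)
series = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
pred = nwre_predict(series[:-1], series[-2])
```

Because the prediction is a convex combination of past targets, it always lies within the observed range of the series; the paper's contribution is to show that this estimator coincides with a particular regularized RBFN and to exploit that link for adaptive, nonstationary prediction.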