Stochastic analysis of gradient adaptive identification of nonlinear systems with memory for Gaussian data and noisy input and output measurements

Citation
N.J. Bershad et al., Stochastic analysis of gradient adaptive identification of nonlinear systems with memory for Gaussian data and noisy input and output measurements, IEEE Transactions on Signal Processing, 47(3), 1999, pp. 675-689
Citations number
24
Subject categories
Electrical & Electronics Engineering
Journal title
IEEE TRANSACTIONS ON SIGNAL PROCESSING
ISSN journal
1053-587X
Volume
47
Issue
3
Year of publication
1999
Pages
675 - 689
Database
ISI
SICI code
1053-587X(199903)47:3<675:SAOGAI>2.0.ZU;2-M
Abstract
This paper investigates the statistical behavior of two gradient search adaptive algorithms for identifying an unknown nonlinear system comprised of a discrete-time linear system H followed by a zero-memory nonlinearity g(.). The input and output of the unknown system are corrupted by additive independent noises. Gaussian models are used for all inputs. Two competing adaptation schemes are analyzed. The first is a sequential adaptation scheme where the LMS algorithm is first used to estimate the linear portion of the unknown system. The LMS algorithm is able to identify the linear portion of the unknown system to within a scale factor. The weights are then frozen at the end of the first adaptation phase. Recursions are derived for the mean and fluctuation behavior of the LMS algorithm, which are in excellent agreement with Monte Carlo simulations. When the nonlinearity is modeled by a scaled error function, the second part of the sequential gradient identification scheme is shown to correctly learn the scale factor and the error function scale factor. Mean recursions for the scale factors show good agreement with Monte Carlo simulations. For slow learning, the stationary points of the gradient algorithm closely agree with the stationary points of the theoretical recursions. The second adaptive scheme simultaneously learns both the linear and nonlinear portions of the unknown channel. The mean recursions for the linear and nonlinear portions show good agreement with Monte Carlo simulations for slow learning. The stationary points of the gradient algorithm also agree with the stationary points of the theoretical recursions.
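
The first phase of the sequential scheme described in the abstract is the standard LMS update driven by the noisy input and output measurements of the Wiener-type system (linear filter H followed by the zero-memory nonlinearity g). The Python sketch below is illustrative only and is not taken from the paper: the filter length, step size, measurement-noise levels, and the tanh stand-in for the scaled error function are all assumptions.

import numpy as np

rng = np.random.default_rng(0)

N = 8                                   # FIR length of the unknown linear portion (assumed)
H = rng.standard_normal(N)              # unknown linear system H
g = np.tanh                             # stand-in for the zero-memory nonlinearity g(.)
mu = 0.01                               # LMS step size (assumed)
sigma_in, sigma_out = 0.05, 0.05        # input/output measurement-noise levels (assumed)

w = np.zeros(N)                         # adaptive weights
x_buf = np.zeros(N)                     # clean input tap-delay line (drives the unknown system)
u_buf = np.zeros(N)                     # noisy measured-input tap-delay line (drives the LMS filter)

for n in range(20000):
    x = rng.standard_normal()                               # Gaussian input sample
    x_buf = np.r_[x, x_buf[:-1]]
    u_buf = np.r_[x + sigma_in * rng.standard_normal(), u_buf[:-1]]
    d = g(H @ x_buf) + sigma_out * rng.standard_normal()    # noisy output measurement
    e = d - w @ u_buf                                       # a priori output error
    w += mu * e * u_buf                                     # LMS weight update

# Phase 1 identifies H only up to a scale factor; estimate it by projecting w onto H.
print("estimated scale factor:", (w @ H) / (H @ H))

After convergence, the weights would be frozen and the second phase would use a scalar gradient search to learn the scale factor above together with the error-function scale factor, as described in the abstract.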