Linear recursive filters can be adapted on-line, but instability problems may arise. Stability-control techniques exist, but they are either computationally expensive or non-robust. In the nonlinear case, e.g., locally recurrent neural networks, the stability of the infinite impulse response (IIR) synapses is often a condition to be satisfied.
This brief considers the known reparametrization-for-stability method for the on-line adaptation of IIR adaptive filters. A new technique is also presented, based on the further adaptation of the squashing function, which improves the convergence performance. The proposed method can be applied to various filter realizations (direct forms, cascade or parallel of second-order sections, lattice forms), as well as to locally recurrent neural networks, such as the IIR multilayer perceptron (IIR-MLP), with improved performance with respect to other techniques and to the case of no stability control. In this brief, the case of normalized lattice filters is particularly considered; the stabilization effects are analyzed both analytically and experimentally.
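
As a hedged illustration (a minimal sketch, not the authors' exact algorithm), the reparametrization idea can be shown for a normalized lattice: stability requires every reflection coefficient to satisfy |k_i| < 1, which is enforced structurally by writing k_i = tanh(beta_i * theta_i) and adapting the unconstrained theta_i, and, in the spirit of the proposed extension, the squashing slope beta_i as well, by gradient descent. All names, the step size, and the placeholder gradient are illustrative assumptions.

    import numpy as np

    def reparametrize(theta, beta):
        # Map unconstrained parameters to reflection coefficients in (-1, 1);
        # the normalized lattice is then stable for any real theta.
        return np.tanh(beta * theta)

    def grads(grad_k, theta, beta):
        # Chain rule through k = tanh(beta * theta):
        # dJ/dtheta = dJ/dk * beta * (1 - k^2)
        # dJ/dbeta  = dJ/dk * theta * (1 - k^2)
        k = np.tanh(beta * theta)
        sech2 = 1.0 - k ** 2
        return grad_k * beta * sech2, grad_k * theta * sech2

    # Illustrative usage: grad_k would come from the filter's output error.
    theta = np.zeros(4)              # unconstrained lattice parameters
    beta = np.ones(4)                # adaptable squashing slopes (assumption)
    mu = 0.01                        # step size (illustrative)
    grad_k = np.random.randn(4)      # placeholder gradient dJ/dk
    g_theta, g_beta = grads(grad_k, theta, beta)
    theta -= mu * g_theta
    beta -= mu * g_beta
    k = reparametrize(theta, beta)   # |k| < 1 guaranteed at every step

Because the constraint is built into the parametrization, no separate pole-monitoring or projection step is needed during adaptation, which is what makes the scheme attractive for on-line use.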