A novel steepest descent (NSD) algorithm and its improved version (the NSDM algorithm) for adaptive filters have recently been proposed, and a local convergence analysis of the NSD algorithm has been performed. In this paper we present two main results. The first is a performance analysis of the NSDM algorithm for adaptive filters with correlated Gaussian data, based on the expected global behaviour approach, in nonstationary environments as well as in the stationary case. The second is an extension of the previous analysis of the NSD algorithm in the stationary case to nonstationary environments. The results of these analyses are verified numerically through computer simulations for an adaptive system identification example with highly correlated input data.
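To make the simulation scenario concrete, the following is a minimal sketch of an adaptive system identification setup with highly correlated Gaussian input, of the kind referred to above. It uses a first-order autoregressive input model and a plain steepest-descent (LMS-type) weight update as a stand-in, since the NSD/NSDM recursions are not reproduced in this abstract; the filter length, AR coefficient, step size, and noise level are illustrative assumptions.

```python
# Hypothetical sketch: adaptive system identification with highly
# correlated Gaussian input. A plain LMS (steepest-descent-type) update
# stands in for the NSD/NSDM algorithms, whose recursions are not given here.
import numpy as np

rng = np.random.default_rng(0)

N = 5000          # number of samples (assumed)
M = 8             # adaptive filter length (assumed)
rho = 0.95        # AR(1) coefficient -> highly correlated input (assumed)
mu = 0.01         # step size (assumed)

w_true = rng.standard_normal(M)           # unknown system to be identified

# Highly correlated Gaussian input: first-order autoregressive process.
x = np.zeros(N)
for n in range(1, N):
    x[n] = rho * x[n - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

w = np.zeros(M)                           # adaptive filter weights
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]          # regressor vector (most recent first)
    d = w_true @ u + 0.01 * rng.standard_normal()   # desired signal plus noise
    e = d - w @ u                         # a priori estimation error
    w = w + mu * e * u                    # steepest-descent-type update

print("final weight error norm:", np.linalg.norm(w - w_true))
```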