Robust recursive least squares learning algorithm for principal component analysis

Citation
Oy. Shan et al., Robust recursive least squares learning algorithm for principal component analysis, IEEE NEURAL, 11(1), 2000, pp. 215-221
Number of citations
19
Subject categories
AI Robotics and Automatic Control
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
Journal ISSN
1045-9227
Volume
11
Issue
1
Year of publication
2000
Pages
215 - 221
Database
ISI
SICI code
1045-9227(200001)11:1<215:RRLSLA>2.0.ZU;2-F
Abstract
A learning algorithm for principal component analysis is developed based on least-squares minimization. The dual learning-rate parameters are adjusted adaptively to make the proposed algorithm capable of fast convergence and high accuracy in extracting all principal components. The proposed algorithm is robust to the error accumulation present in sequential principal component analysis (PCA) algorithms. We show that all information needed for PCA can be completely represented by the unnormalized weight vector, which is updated based only on the corresponding neuron's input-output product. The update of the normalized weight vector can be regarded as a leaky Hebb's rule. The convergence of the proposed algorithm is briefly analyzed. We also establish the relation between Oja's rule and the least-squares learning rule. Finally, simulation results are given to illustrate the effectiveness of this algorithm for PCA and for tracking time-varying directions-of-arrival.
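The abstract mentions Oja's rule, the classical Hebbian baseline against which the paper's least-squares rule is compared. As a point of reference only (this is not the paper's RLS algorithm), a minimal sketch of Oja's single-neuron update w <- w + eta*y*(x - y*w) on synthetic Gaussian data, where the data, learning rate eta, and sample count are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data whose leading principal direction is [1, 1]/sqrt(2)
# (covariance eigenvalues 5 and 1). Data and parameters are illustrative.
C = np.array([[3.0, 2.0],
              [2.0, 3.0]])
X = rng.multivariate_normal(np.zeros(2), C, size=5000)

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.01  # fixed learning rate; the paper instead adapts dual rates

for x in X:
    y = w @ x                    # neuron output y = w^T x
    w += eta * y * (x - y * w)   # Oja's (leaky Hebbian) update

w /= np.linalg.norm(w)
v = np.linalg.eigh(C)[1][:, -1]  # true leading eigenvector
print(abs(w @ v))                # alignment with the true direction, near 1
```

The `(x - y*w)` term is the "leaky" correction that keeps `w` near unit norm; without it, plain Hebbian learning `w += eta*y*x` diverges.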