A learning algorithm for principal component analysis (PCA) is developed based on least-squares minimization. The dual learning-rate parameters are adjusted adaptively so that the proposed algorithm achieves fast convergence and high accuracy in extracting all principal components. The proposed algorithm is robust to the error accumulation that arises in sequential PCA algorithms. We show that all information needed for PCA can be completely represented by the unnormalized weight vector, which is updated based only on the corresponding neuron's input-output product. The update of the normalized weight vector can be regarded as a leaky Hebb's rule. The convergence of the proposed algorithm is briefly analyzed. We also establish the relation between Oja's rule and the least-squares learning rule. Finally, simulation results are given to illustrate the effectiveness of the algorithm for PCA and for tracking time-varying directions-of-arrival.
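For reference, Oja's rule mentioned above updates a weight vector w by w <- w + eta * y * (x - y * w), with y = w'x, which can be read as a Hebbian term plus a leaky normalizing decay. The minimal sketch below (Python with NumPy, synthetic Gaussian data and a fixed learning rate, both assumed for illustration) shows this standard rule extracting the first principal component; it is not the proposed dual-learning-rate algorithm, only the classical baseline it is related to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic zero-mean data with an anisotropic covariance (assumed for illustration).
cov = np.array([[3.0, 1.0],
                [1.0, 1.5]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=5000)

# Oja's rule: w <- w + eta * y * (x - y * w), with y = w.T @ x.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.01  # assumed learning rate

for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# Compare the learned direction with the dominant eigenvector of the sample covariance.
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
v1 = eigvecs[:, -1]
print("Oja estimate:        ", w / np.linalg.norm(w))
print("Dominant eigenvector:", v1)
```

Up to sign, the two printed directions should agree closely once the data set is large enough, which is the sense in which Oja's rule performs online PCA.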