This paper deals with some of the effects of finite-precision data representation and arithmetic in principal component analysis (PCA) neural networks. The PCA networks are single-layer linear neural networks that use some version of Oja's learning rule. The paper concentrates on the effects of premature convergence, or early termination, of the learning process. It determines an approximate analytical expression for the lower limit of the learning-rate parameter. If the learning rate is selected below this limit, which depends on the statistical properties of the input data and on the quantum size used in the finite-precision arithmetic, convergence slows down significantly or the learning process stops before reaching the proper weight vector.
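
The stalling effect described above can be illustrated with a minimal sketch (not the paper's exact model or derivation): Oja's single-unit rule trained with the weight vector rounded to a fixed quantum q after each step. When the learning rate is small enough that the typical update falls below q/2, the rounded update is zero and learning freezes at the initial weights. All names, the data distribution, and the parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, q):
    """Round each component to the nearest multiple of the quantum q."""
    return q * np.round(v / q)

def oja_quantized(X, eta, q, steps=20000):
    """Oja's rule, w <- w + eta*(y*x - y^2*w), with the weight vector
    stored in a quantized (fixed-point-like) representation."""
    w = quantize(0.1 * rng.normal(size=X.shape[1]), q)
    for _ in range(steps):
        x = X[rng.integers(len(X))]   # draw one training sample
        y = w @ x                     # linear unit output
        w = quantize(w + eta * (y * x - y * y * w), q)
    return w

# Correlated 2-D data whose principal axis is close to (1, 1)/sqrt(2).
X = rng.normal(size=(1000, 2)) @ np.array([[1.0, 0.9], [0.9, 1.0]])
principal = np.linalg.eigh(X.T @ X / len(X))[1][:, -1]

q = 1e-3  # quantum size of the finite-precision representation
for eta in (1e-1, 1e-4):  # one rate above, one below the stalling regime
    w = oja_quantized(X, eta, q)
    align = abs(w @ principal) / np.linalg.norm(w)
    print(f"eta={eta:g}: |cos(angle to principal axis)| = {align:.3f}")
```

With the larger rate the weight vector aligns closely with the principal eigenvector, while with the rate below the quantization threshold the updates round to zero and the alignment stays near its random initial value, consistent with the lower limit on the learning rate discussed in the paper.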