Finite word length computational effects of the principal component analysis networks

Citation
T. Szabo and G. Horvath, Finite word length computational effects of the principal component analysis networks, IEEE INSTR, 47(5), 1998, pp. 1218-1222
Number of citations
12
Subject Categories
Instrumentation & Measurement
Journal title
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT
ISSN journal
0018-9456
Volume
47
Issue
5
Year of publication
1998
Pages
1218 - 1222
Database
ISI
SICI code
0018-9456(199810)47:5<1218:FWLCEO>2.0.ZU;2-I
Abstract
This paper deals with some of the effects of finite precision data representation and arithmetic in principal component analysis (PCA) neural networks. PCA networks are single-layer linear neural networks that use some version of Oja's learning rule. The paper concentrates on the effects of premature convergence or early termination of the learning process. It determines an approximate analytical expression for the lower limit of the learning rate parameter. If the learning rate is selected below this limit, which depends on the statistical properties of the input data and the quantum size used in the finite precision arithmetic, the convergence will slow down significantly or the learning process will stop before converging to the proper weight vector.
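The effect summarized in the abstract can be illustrated with a minimal simulation: a single-unit PCA network trained with Oja's rule, w <- w + eta * y * (x - y * w) with y = w.x, where the weight vector is stored in fixed-point form with quantum size q. This sketch is not the paper's analytical derivation; the function names (quantize, oja_quantized), the quantum size q = 2^-8, the toy data, and the chosen learning rates are assumptions made purely for illustration.

```python
import numpy as np

def quantize(v, q):
    """Round each component to the nearest multiple of the quantum size q,
    mimicking fixed-point (finite word length) storage of the weights."""
    return np.round(v / q) * q

def oja_quantized(X, eta, q, epochs=50, seed=0):
    """Train a single linear unit with Oja's rule,
    w <- w + eta * y * (x - y * w), y = w.x,
    re-quantizing the weight vector after every update."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w = quantize(w / np.linalg.norm(w), q)
    for _ in range(epochs):
        for x in X:
            y = w @ x
            w = quantize(w + eta * y * (x - y * w), q)
    return w

# Toy data whose first principal direction is approximately (1, 0).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2)) * np.array([3.0, 0.3])

q = 2.0 ** -8                       # hypothetical quantum size (8 fractional bits)
for eta in (1e-2, 1e-3, 1e-5):      # the smallest rate falls below the stalling limit
    w = oja_quantized(X, eta, q)
    print(f"eta={eta:.0e}  |w|={np.linalg.norm(w):.3f}  w={np.round(w, 3)}")
```

With the largest rate the weight vector converges toward the principal direction; with the smallest rate the individual updates are rounded to zero by the quantizer, so the weights never leave their initial values, mirroring the premature convergence / early termination effect the paper analyzes.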