Training a perceptron in a discrete weight space - art. no. 046109

Citation
M. Rosen-Zvi and I. Kanter, Training a perceptron in a discrete weight space - art. no. 046109, PHYS REV E, 6404(4), 2001, pp. 6109
Number of citations
28
Subject categories
Physics
Journal title
PHYSICAL REVIEW E
ISSN journal
1063-651X
Volume
6404
Issue
4
Year of publication
2001
Part
2
Database
ISI
SICI code
1063-651X(200110)6404:4<6109:TAPIAD>2.0.ZU;2-L
Abstract
Learning in a perceptron having a discrete weight space, where each weight can take 2L+1 different values, is examined analytically and numerically. The learning algorithm is based on the training of a continuous perceptron, with predictions made using the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as exp(-Kα²) in the case of on-line learning with a binary activation function, and as exp(-e^(λα)) in the case of on-line learning with a continuous one, where α is the number of examples divided by N, the size of the input vector, and K is a positive constant. For finite N and L, perfect agreement between the discrete student and the teacher is obtained for α proportional to L√ln(NL). A crossover to a generalization error proportional to 1/α, characterizing continuous weights with binary output, is obtained for synaptic depth L > O(√N).
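
Illustrative sketch (not from the paper): the abstract describes training a continuous student perceptron on-line while judging generalization with its clipped, discrete-valued weights. The Python sketch below assumes a simple mistake-driven perceptron update and a particular quantization onto the 2L+1 integer levels; the sizes N, L, the number of examples, and the quantizer are all illustrative assumptions, not the authors' prescription.

import numpy as np

rng = np.random.default_rng(0)
N, L, n_examples = 200, 2, 4000          # illustrative sizes, not the paper's

# Teacher perceptron with discrete weights in {-L, ..., L}
teacher = rng.integers(-L, L + 1, size=N).astype(float)

# Continuous student; predictions are made with its clipped (discrete) version
student = np.zeros(N)

def sgn(z):
    """Binary output, mapping 0 to +1 to avoid sign(0) = 0."""
    return 1.0 if z >= 0 else -1.0

def clip_weights(w, levels):
    """Quantize continuous weights onto the 2L+1 integer levels -L..L.
    Scaling by the largest weight is one plausible quantizer,
    not necessarily the paper's prescription."""
    scale = np.max(np.abs(w)) or 1.0
    return np.clip(np.round(levels * w / scale), -levels, levels)

mistakes = 0
for t in range(n_examples):
    x = rng.choice([-1.0, 1.0], size=N)   # random binary input pattern
    label = sgn(teacher @ x)              # teacher's binary output
    # Generalization is judged with the clipped student
    if sgn(clip_weights(student, L) @ x) != label:
        mistakes += 1
    # Mistake-driven (perceptron-style) update of the continuous weights
    if sgn(student @ x) != label:
        student += label * x / np.sqrt(N)

alpha = n_examples / N
clipped = clip_weights(student, L)
overlap = clipped @ teacher / (np.linalg.norm(clipped) * np.linalg.norm(teacher) + 1e-12)
print(f"alpha = {alpha:.1f}, prediction mistakes = {mistakes}, "
      f"teacher/clipped-student overlap = {overlap:.3f}")

The printed overlap between the teacher and the clipped student plays the role of the order parameters mentioned in the abstract; as α grows it approaches 1, corresponding to a vanishing generalization error of the clipped weights.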