FINITE-SIZE-SCALING OF THE BAYESIAN PERCEPTRON

Citation
A. Buhot et al., FINITE-SIZE-SCALING OF THE BAYESIAN PERCEPTRON, Physical Review E: Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics, 55(6), 1997, pp. 7434-7440
Citations number
14
Categorie Soggetti
Physics, Mathematical; Physics, Fluids & Plasmas
ISSN journal
1063651X
Volume
55
Issue
6
Year of publication
1997
Part
B
Pages
7434 - 7440
Database
ISI
SICI code
1063-651X(1997)55:6<7434:FOTBP>2.0.ZU;2-Z
Abstract
We study numerically the properties of the Bayesian perceptron through a gradient descent on the optimal cost function. The theoretical distribution of stabilities is deduced. It predicts that the optimal generalizer lies close to the boundary of the space of (error-free) solutions. The numerical simulations are in good agreement with the theoretical distribution. The extrapolation of the generalization error to infinite input space size agrees with the theoretical results. Finite-size corrections are negative and exhibit two different scaling regimes, depending on the training set size. The variance of the generalization error vanishes for N → ∞, confirming the property of self-averaging.
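The setup described in the abstract, a student perceptron trained by gradient descent on examples labeled by a teacher, with the generalization error computed from the student–teacher overlap, can be sketched as follows. This is an illustrative sketch only: the smooth logistic cost and all parameter values (N, P, learning rate, step count) are stand-ins chosen for demonstration, not the paper's optimal Bayesian cost function or its simulation parameters.

```python
import numpy as np

# Sketch of a teacher-student perceptron experiment (hypothetical
# parameters; the cost below is a logistic surrogate, NOT the
# optimal Bayesian cost function studied in the paper).

rng = np.random.default_rng(0)
N, P = 50, 200                         # input dimension, training set size

teacher = rng.standard_normal(N)       # teacher weight vector
X = rng.standard_normal((P, N))        # random input patterns
y = np.sign(X @ teacher)               # labels provided by the teacher

w = rng.standard_normal(N)             # student weights
lr = 0.05
for _ in range(500):
    margins = y * (X @ w) / np.sqrt(N)              # stabilities (up to |w| normalization)
    # gradient of sum_i log(1 + exp(-margin_i)) with respect to w
    grad = -((1.0 / (1.0 + np.exp(margins)))[:, None]
             * y[:, None] * X).sum(axis=0) / np.sqrt(N)
    w -= lr * grad

# Overlap R between student and teacher directions gives the
# generalization error for Gaussian inputs: eps_g = arccos(R) / pi.
R = (w @ teacher) / (np.linalg.norm(w) * np.linalg.norm(teacher))
eps_g = np.arccos(R) / np.pi
print(f"overlap R = {R:.3f}, generalization error eps_g = {eps_g:.3f}")
```

Repeating such runs over many sample sets at several values of N is what allows the finite-size corrections to eps_g, and their vanishing variance as N grows, to be measured numerically.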