A. Buhot et al., "Finite-size scaling of the Bayesian perceptron", Physical Review E (Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics) 55(6), 1997, pp. 7434-7440
We study numerically the properties of the Bayesian perceptron through a gradient descent on the optimal cost function. The theoretical distribution of stabilities is deduced. It predicts that the optimal generalizer lies close to the boundary of the space of (error-free) solutions. The numerical simulations are in good agreement with the theoretical distribution. The extrapolation of the generalization error to infinite input space size agrees with the theoretical results. Finite-size corrections are negative and exhibit two different scaling regimes, depending on the training set size. The variance of the generalization error vanishes for N → ∞, confirming the property of self-averaging.
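A minimal sketch, not the paper's actual code, of the teacher-student setup the abstract refers to: a student perceptron is trained by gradient descent on a smooth cost over examples labeled by a fixed teacher vector, and the generalization error is computed from the normalized student-teacher overlap R as eps = arccos(R)/pi, the standard result for isotropic random inputs. The logistic cost, learning rate, and all function names here are illustrative assumptions; the paper's Bayes-optimal cost function is not reproduced.

```python
import numpy as np

def generalization_error(w_student, w_teacher):
    # For isotropic random inputs, a perceptron's generalization error is
    # eps = arccos(R) / pi, with R the normalized student-teacher overlap.
    R = w_student @ w_teacher / (
        np.linalg.norm(w_student) * np.linalg.norm(w_teacher))
    return np.arccos(np.clip(R, -1.0, 1.0)) / np.pi

def train_student(N=50, alpha=3.0, lr=0.1, epochs=300, seed=0):
    # Teacher-student scenario: labels come from a fixed teacher vector and
    # the student minimizes a smooth cost by gradient descent.
    # NOTE: the logistic cost below is a stand-in assumption, not the
    # Bayes-optimal cost function studied in the paper.
    rng = np.random.default_rng(seed)
    w_teacher = rng.standard_normal(N)
    P = int(alpha * N)                    # training set size scales with N
    X = rng.standard_normal((P, N))
    y = np.sign(X @ w_teacher)
    w = rng.standard_normal(N)            # random initial student
    for _ in range(epochs):
        margins = np.clip(y * (X @ w), -30.0, 30.0)
        weights = 1.0 / (1.0 + np.exp(margins))   # logistic-loss derivative
        grad = -(weights[:, None] * y[:, None] * X).mean(axis=0)
        w -= lr * grad
    return generalization_error(w, w_teacher)
```

As the number of examples per input dimension (alpha) grows, the student-teacher overlap increases and eps shrinks, mirroring the decrease of the generalization error with training set size discussed in the abstract.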