E. Barnard, "A model for nonpolynomial decrease in error rate with increasing sample size," IEEE Transactions on Neural Networks, 5(6), 1994, pp. 994-997.
Much theoretical evidence exists for an inverse proportionality between the error rate of a classifier and the number of samples used to train it. Cohn and Tesauro [1] have, however, discovered various problems that experimentally display an approximately exponential decrease in error rate. We present evidence that the observed exponential decrease is caused by the finite nature of the problems studied. A simple model classification problem is presented, which demonstrates how the error rate approaches zero exponentially or faster when sufficiently many training samples are used.
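For concreteness, the two scaling behaviors contrasted above can be written side by side. This is a sketch only: the symbols (epsilon for the error rate, N for the number of training samples, and the positive constants b and c) are introduced here for illustration and are not taken from the source.

  % Theoretical prediction: error rate inversely proportional to sample size
  \[ \epsilon_{\mathrm{theory}}(N) \propto \frac{1}{N} \]

  % Behavior observed by Cohn and Tesauro [1] on finite problems:
  % approximately exponential decrease toward zero
  \[ \epsilon_{\mathrm{observed}}(N) \approx b\, e^{-cN}, \qquad c > 0 \]

The abstract's claim is that the second, faster-than-polynomial form arises from the finite nature of the problems studied, once sufficiently many training samples are used.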