A MODEL FOR NONPOLYNOMIAL DECREASE IN ERROR RATE WITH INCREASING SAMPLE-SIZE

Authors
E. Barnard
Citation
E. Barnard, A MODEL FOR NONPOLYNOMIAL DECREASE IN ERROR RATE WITH INCREASING SAMPLE-SIZE, IEEE Transactions on Neural Networks, 5(6), 1994, pp. 994-997
Citations number
5
Subject Categories
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
5
Issue
6
Year of publication
1994
Pages
994 - 997
Database
ISI
SICI code
1045-9227(1994)5:6<994:AMFNDI>2.0.ZU;2-L
Abstract
Much theoretical evidence exists for an inverse proportionality between the error rate of a classifier and the number of samples used to train it. Cohn and Tesauro [1] have, however, discovered various problems which experimentally display an approximately exponential decrease in error rate. We present evidence that the observed exponential decrease is caused by the finite nature of the problems studied. A simple model classification problem is presented, which demonstrates how the error rate approaches zero exponentially or faster when sufficiently many training samples are used.
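The abstract's claim can be illustrated with a minimal sketch (not the paper's actual model, whose details are not given here): on a finite problem with K equally likely discrete inputs, a classifier that simply memorizes the labels of the training inputs it has seen, and guesses on unseen inputs, has an expected error rate of 0.5 * ((K-1)/K)^n after n training samples, which decays exponentially in n. The simulation below, with hypothetical parameter names, estimates this curve by Monte Carlo.

```python
import random


def simulate_error_rate(num_inputs=32,
                        train_sizes=(1, 2, 4, 8, 16, 32, 64, 128),
                        trials=5000, seed=0):
    """Estimate the test error of a memorizing classifier on a finite
    problem: `num_inputs` equally likely discrete inputs, each with a
    fixed binary label."""
    rng = random.Random(seed)
    labels = [rng.randint(0, 1) for _ in range(num_inputs)]
    results = {}
    for n in train_sizes:
        errors = 0
        for _ in range(trials):
            # Draw n training inputs with replacement; the classifier
            # memorizes the labels of the inputs it has seen.
            seen = {rng.randrange(num_inputs) for _ in range(n)}
            # Test on a fresh random input; unseen inputs get a coin-flip
            # guess, which is wrong half the time on average.
            x = rng.randrange(num_inputs)
            if x not in seen and rng.randint(0, 1) != labels[x]:
                errors += 1
        results[n] = errors / trials
    return results


if __name__ == "__main__":
    for n, err in simulate_error_rate().items():
        print(f"n = {n:4d}  estimated error rate = {err:.4f}")
```

Running this prints an error rate that roughly halves with each doubling-plus of n once n is comparable to the number of distinct inputs, consistent with an exponential (rather than 1/n) decrease once the finite input space is nearly covered.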