S. Salzberg et al., "Best-Case Results for Nearest-Neighbor Learning," IEEE Transactions on Pattern Analysis and Machine Intelligence, 17(6), 1995, pp. 599-608
In this paper we propose a theoretical model for the analysis of classification methods, in which the teacher knows the classification algorithm and chooses examples in the best way possible. We apply this model to the nearest-neighbor (NN) learning algorithm and develop upper and lower bounds on sample complexity for several different concept classes. For some concept classes, the sample complexity turns out to be exponential even under this best-case model, which implies that the concept class is inherently difficult for the NN algorithm. We identify several geometric properties that make learning certain concepts relatively easy. Finally, we discuss the relation of our work to helpful-teacher models, its application to decision-tree learning algorithms, and some of its implications for current experimental work.
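To make the learning rule under discussion concrete, the following is a minimal sketch of the standard 1-nearest-neighbor classifier: the learner memorizes the teacher-chosen examples and labels a query by its closest stored example. The Euclidean metric, the function names, and the toy sample below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the 1-nearest-neighbor (1-NN) rule analyzed in the paper.
# The distance metric, function names, and example data are assumptions for
# illustration only.
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nn_classify(training_set, query):
    """Return the label of the stored example closest to `query`.

    `training_set` is a list of (feature_vector, label) pairs; in the
    paper's model these examples are chosen by a helpful teacher, and
    the learner predicts by nearest neighbor.
    """
    _, label = min(training_set, key=lambda ex: euclidean(ex[0], query))
    return label

if __name__ == "__main__":
    # Hypothetical teacher-chosen sample for a simple two-class concept.
    sample = [((0.2, 0.2), "+"), ((0.8, 0.8), "+"),
              ((0.0, 1.0), "-"), ((1.0, 0.0), "-")]
    print(nn_classify(sample, (0.25, 0.3)))  # -> "+"
```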