STRONG MINIMAX LOWER BOUNDS FOR LEARNING

Authors
A. Antos, G. Lugosi
Citation
A. Antos and G. Lugosi, STRONG MINIMAX LOWER BOUNDS FOR LEARNING, Machine Learning, 30(1), 1998, pp. 31-56
Citations number
12
Categorie Soggetti
Computer Science, Artificial Intelligence
Journal title
Machine Learning
ISSN journal
08856125
Volume
30
Issue
1
Year of publication
1998
Pages
31 - 56
Database
ISI
SICI code
0885-6125(1998)30:1<31:SMLBFL>2.0.ZU;2-7
Abstract
Minimax lower bounds for concept learning state, for example, that for each sample size n and learning rule g(n), there exists a distribution of the observation X and a concept C to be learnt such that the expected error of g(n) is at least a constant times V/n, where V is the VC dimension of the concept class. However, these bounds say nothing about the rate of decrease of the error for a fixed distribution-concept pair. In this paper we investigate minimax lower bounds in such a (stronger) sense. We show that for several natural k-parameter concept classes, including the class of linear halfspaces, the class of balls, the class of polyhedra with a certain number of faces, and a class of neural networks, for any sequence of learning rules {g(n)}, there exists a fixed distribution of X and a fixed concept C such that the expected error is larger than a constant times k/n for infinitely many n. We also obtain such strong minimax lower bounds for the tail distribution of the probability of error, which extend the corresponding minimax lower bounds.
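The two bounds contrasted in the abstract can be sketched in formulas. This is an illustrative rendering only: the notation L(g(n)) for the probability of error of the rule g(n) trained on n samples, and the unspecified constants c, c', are assumptions not stated in this record.

```latex
% Classical minimax lower bound: the worst-case pair (X, C) may change with n.
\inf_{g_n}\;\sup_{(X,\,C)}\; \mathbb{E}\,L(g_n) \;\ge\; c\,\frac{V}{n}

% Strong version shown in the paper: for any fixed sequence {g_n} of learning
% rules over a natural k-parameter class, one fixed pair (X, C) works for
% infinitely many n.
\exists\,(X,C)\ \text{fixed}:\quad
\mathbb{E}\,L(g_n) \;\ge\; c'\,\frac{k}{n}
\quad\text{for infinitely many } n.
```

The difference is the order of quantifiers: in the classical bound the adversarial distribution-concept pair is chosen after the sample size, while the strong bound pins down a single pair that defeats the whole sequence of rules infinitely often.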