Minimum generalization via reflection: A fast linear threshold learner

Citation
S. Hampson and D. Kibler, Minimum generalization via reflection: A fast linear threshold learner, MACH LEARN, 37(1), 1999, pp. 51-73
Citation count
23
Subject categories
AI Robotics and Automatic Control
Journal title
MACHINE LEARNING
Journal ISSN
0885-6125
Volume
37
Issue
1
Year of publication
1999
Pages
51 - 73
Database
ISI
SICI code
0885-6125(199910)37:1<51:MGVRAF>2.0.ZU;2-2
Abstract
The number of adjustments required to learn the average LTU function of d features, each of which can take on n equally spaced values, grows as approximately n^(2d) when the standard perceptron training algorithm is used on the complete input space of n^d points and perfect classification is required. We demonstrate a simple modification that reduces the observed growth rate in the number of adjustments to approximately d^2(log(d) + log(n)) with most, but not all, input presentation orders. A similar speed-up is also produced by applying the simple but computationally expensive heuristic "don't overgeneralize" to the standard training algorithm. This performance is very close to the theoretical optimum for learning LTU functions by any method, and is evidence that perceptron-like learning algorithms can learn arbitrary LTU functions in polynomial, rather than exponential, time under normal training conditions. Similar modifications can be applied to the Winnow algorithm, achieving similar performance improvements and demonstrating the generality of the approach.
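For orientation, the sketch below shows the baseline setup the abstract measures: a standard perceptron cycled over the complete input space of d features with n integer values each, counting weight adjustments until a target LTU function is classified perfectly. It does not implement the paper's reflection-based modification or the "don't overgeneralize" heuristic; the function name, target weights, and max_epochs cap are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (assumed setup, not the paper's algorithm): count the
# weight adjustments a standard perceptron needs to exactly reproduce a
# target LTU function over the full grid of n**d input points.
import itertools
import random

def perceptron_adjustments(d, n, target_w, target_b, max_epochs=1000, seed=0):
    rng = random.Random(seed)
    points = list(itertools.product(range(n), repeat=d))  # complete input space
    w = [0.0] * d
    b = 0.0
    adjustments = 0
    for _ in range(max_epochs):
        rng.shuffle(points)  # presentation order matters, per the abstract
        errors = 0
        for x in points:
            desired = 1 if sum(tw * xi for tw, xi in zip(target_w, x)) + target_b > 0 else 0
            predicted = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if predicted != desired:
                step = 1 if desired == 1 else -1  # standard perceptron update
                w = [wi + step * xi for wi, xi in zip(w, x)]
                b += step
                adjustments += 1
                errors += 1
        if errors == 0:  # perfect classification on the whole input space
            break
    return adjustments

# Example: a hypothetical 3-feature LTU target over 5 values per feature.
print(perceptron_adjustments(d=3, n=5, target_w=[2, -1, 1], target_b=-3))
```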