An algorithm is proposed for training the single-layered perceptron. The algorithm follows successive steepest descent directions with respect to the perceptron cost function, taking care not to increase the number of misclassified patterns. The problem of finding these directions is stated as a quadratic programming task, to which a fast and effective solution is proposed. The resulting algorithm has no free parameters, and therefore no heuristics are involved in its application. It is proved that the algorithm always converges in a finite number of steps. For linearly separable problems, it always finds a hyperplane that completely separates patterns belonging to different categories. Termination of the algorithm without separating all given patterns means that the presented set of patterns is indeed linearly inseparable. Thus the algorithm provides a natural criterion for linear separability. Compared to other state-of-the-art algorithms, the proposed method exhibits substantially improved speed, as demonstrated in a number of demanding benchmark classification tasks. (C) 2000 Published by Elsevier Science Ltd. All rights reserved.
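
For reference, the sketch below illustrates the perceptron cost function mentioned in the abstract and a plain fixed-step gradient descent on it. It is not the paper's method: the proposed algorithm obtains each descent direction from a quadratic program and has no free parameters, whereas this sketch uses an assumed learning rate, iteration cap, and toy data purely for illustration.

```python
import numpy as np

def perceptron_cost(w, X, y):
    """Perceptron cost: sum of -y_i * (w . x_i) over misclassified patterns.

    X is (n_samples, n_features) with a bias column appended; y holds labels in {-1, +1}.
    Returns the cost value and the boolean mask of misclassified patterns.
    """
    margins = y * (X @ w)
    misclassified = margins <= 0
    return -np.sum(margins[misclassified]), misclassified

def train_perceptron(X, y, lr=0.1, max_iter=1000):
    """Fixed-step steepest descent on the perceptron cost (illustrative only).

    NOTE: the paper's algorithm instead derives each direction from a quadratic
    programming task and involves no tunable parameters such as lr or max_iter.
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        _, mis = perceptron_cost(w, X, y)
        if not mis.any():  # all patterns correctly classified: stop
            break
        # Step along the negative gradient, summed over misclassified patterns
        w += lr * (X[mis] * y[mis][:, None]).sum(axis=0)
    return w

# Usage on a small linearly separable toy problem (hypothetical data)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
X = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias term
y = np.array([1, 1, -1, -1])
print("weights:", train_perceptron(X, y))
```

On a linearly separable set such as the toy data above, this loop stops once no pattern is misclassified; the abstract's stronger claims (finite-step convergence without free parameters, and inseparability detection on termination) apply to the QP-based procedure described in the paper, not to this fixed-step sketch.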