Efficient perceptron learning using constrained steepest descent

Citation
S.J. Perantonis and V. Virvilis, Efficient perceptron learning using constrained steepest descent, Neural Networks, 13(3), 2000, pp. 351-364
Number of citations
25
Subject Categories
AI, Robotics and Automatic Control
Journal title
NEURAL NETWORKS
Journal ISSN
0893-6080
Volume
13
Issue
3
Year of publication
2000
Pages
351-364
Database
ISI
SICI code
0893-6080(200004)13:3<351:EPLUCS>2.0.ZU;2-D
Abstract
An algorithm is proposed for training the single-layered perceptron. The algorithm follows successive steepest descent directions with respect to the perceptron cost function, taking care not to increase the number of misclassified patterns. The problem of finding these directions is stated as a quadratic programming task, to which a fast and effective solution is proposed. The resulting algorithm has no free parameters and therefore no heuristics are involved in its application. It is proved that the algorithm always converges in a finite number of steps. For linearly separable problems, it always finds a hyperplane that completely separates patterns belonging to different categories. Termination of the algorithm without separating all given patterns means that the presented set of patterns is indeed linearly inseparable. Thus the algorithm provides a natural criterion for linear separability. Compared to other state-of-the-art algorithms, the proposed method exhibits substantially improved speed, as demonstrated in a number of demanding benchmark classification tasks. (C) 2000 Published by Elsevier Science Ltd. All rights reserved.
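
Illustrative sketch (not from the paper): the abstract describes steepest descent on the perceptron cost, with the descent direction constrained so that no currently correct pattern becomes misclassified, and the constrained direction obtained by quadratic programming. The minimal NumPy sketch below only illustrates that idea under stated assumptions; the function name csd_perceptron is hypothetical, and the cyclic-projection loop is a crude stand-in for the fast QP solution the authors propose.

import numpy as np

def csd_perceptron(X, y, max_iter=1000, proj_sweeps=20, step=1.0):
    """Illustrative constrained-steepest-descent perceptron (sketch, not
    the authors' algorithm). X: (n, d) patterns; y: (n,) labels in {-1, +1}.
    Returns (w, separated)."""
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        margins = y * (X @ w)
        wrong = margins <= 0
        if not wrong.any():
            return w, True  # all patterns separated: done
        # Steepest descent direction of the perceptron cost
        # J(w) = -sum over misclassified i of y_i * (w . x_i).
        d = (y[wrong, None] * X[wrong]).sum(axis=0)
        # Constraint rows: keep every currently correct pattern correct,
        # i.e. require d . (y_i x_i) >= 0 for each correct pattern.
        A = y[~wrong, None] * X[~wrong]
        for _ in range(proj_sweeps):  # cyclic projection onto the cone A d >= 0
            viol = A @ d < 0
            if not viol.any():
                break
            for a in A[viol]:
                d = d - (d @ a) / (a @ a) * a  # project onto half-space d.a >= 0
        norm = np.linalg.norm(d)
        if norm < 1e-12:
            return w, False  # no admissible descent direction found
        w = w + step * d / norm
    return w, False

if __name__ == "__main__":
    # Toy linearly separable problem: logical OR with a bias input.
    X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
    y = np.array([-1, 1, 1, 1], dtype=float)
    w, ok = csd_perceptron(X, y)
    print(ok, w)

Projecting onto one half-space can re-violate another, so the loop sweeps until all constraints hold or the sweep budget runs out; a (near-)zero projected direction is taken as evidence that no admissible descent direction exists, loosely mirroring the linear-separability criterion described in the abstract.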