AN OPTIMAL PARALLEL PERCEPTRON LEARNING ALGORITHM FOR A LARGE TRAINING SET

Authors
T.P. Hong, S.S. Tseng
Citation
T.P. Hong and S.S. Tseng, AN OPTIMAL PARALLEL PERCEPTRON LEARNING ALGORITHM FOR A LARGE TRAINING SET, Parallel Computing, 20(3), 1994, pp. 347-352
Citations number
3
Subject Categories
Computer Sciences; Computer Science Theory & Methods
Journal title
Parallel Computing
ISSN journal
0167-8191
Volume
20
Issue
3
Year of publication
1994
Pages
347 - 352
Database
ISI
SICI code
0167-8191(1994)20:3<347:AOPPLA>2.0.ZU;2-2
Abstract
In [2], a parallel perceptron learning algorithm on the single-channel broadcast communication model was proposed to speed up the learning of perceptron weights [3]. The results in [2] showed that given n training examples, the average speedup is 1.48n^0.91 / log n with n processors. Here, we explain how the parallelization may be modified so that it is applicable to any number of processors. Both analytical and experimental results show that the average speedup can reach nearly O(r) with r processors if r is much less than n.
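The abstract only summarizes the approach. The sketch below is our own illustration in Python, not the authors' single-channel broadcast algorithm: it shows the general idea of splitting the search for a misclassified example across r shards, so each perceptron round costs O(n/r) comparisons instead of O(n), which is where a near-O(r) speedup for r much less than n would come from. The function names and toy data are hypothetical.

```python
import numpy as np

def find_misclassified(w, X, y, shard):
    """Worker step: return the first index in `shard` that the current
    weights misclassify, or None if the shard is fully correct."""
    for i in shard:
        if y[i] * (X[i] @ w) <= 0:
            return i
    return None

def parallel_perceptron(X, y, r, lr=1.0, max_rounds=10_000):
    """Data-parallel perceptron sketch: the n examples are split across
    r workers; each round every worker scans only its own n/r-example
    shard, and one misclassified example is 'broadcast' so all copies
    of the weights receive the same update. Labels y must be +/-1."""
    n, d = X.shape
    w = np.zeros(d)
    shards = np.array_split(np.arange(n), r)
    for _ in range(max_rounds):
        # In a genuinely parallel run the r searches proceed
        # simultaneously, so a round costs O(n/r) rather than O(n).
        hits = [find_misclassified(w, X, y, s) for s in shards]
        hits = [i for i in hits if i is not None]
        if not hits:              # every shard correctly classified: done
            return w
        i = hits[0]               # pick one winner, as a broadcast would
        w += lr * y[i] * X[i]     # the same update is applied everywhere
    return w

# Tiny usage example on a linearly separable toy set (hypothetical data):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    w = parallel_perceptron(X, y, r=4)
    print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

Because the per-round work is dominated by the shard scans, the wall-clock cost of a round shrinks roughly linearly in r until r approaches n, consistent with the near-O(r) average speedup claimed in the abstract.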