P. Ienne et al., MODIFIED SELF-ORGANIZING FEATURE MAP ALGORITHMS FOR EFFICIENT DIGITAL HARDWARE IMPLEMENTATION, IEEE Transactions on Neural Networks, 8(2), 1997, pp. 315-330
This paper describes two variants of Kohonen's self-organizing feature map (SOFM) algorithm. Both variants update the weights only after presentation of a group of input vectors; in contrast, the original algorithm updates the weights after presentation of every input vector. The main advantage of these variants is that they expose a finer grain of parallelism, for implementation on machines with a very large number of processors, without compromising the desired properties of the algorithm. In this work it is proved that, for one-dimensional (1-D) maps and 1-D continuous input and weight spaces, the strictly increasing or decreasing weight configurations form an absorbing class in both variants, exactly as in the original algorithm. Ordering of the maps and convergence to asymptotic values are also proved, again confirming the theoretical results obtained for the original algorithm. Simulations of a real-world application using two-dimensional (2-D) maps on 12-D speech data are presented to back up the theoretical results and to show that the performance of one of the variants is in all respects almost as good as that of the original algorithm. Finally, the practical utility of the finer parallelism made available is confirmed by the description of a massively parallel hardware system that makes effective use of the best variant.
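The core distinction the abstract draws, online updates after every input versus deferred updates after a group of inputs, can be sketched as follows. This is an illustrative 1-D SOFM in Python, not the paper's exact variants: the function names, the rectangular neighborhood, and the accumulate-then-apply batch rule are assumptions chosen to make the contrast concrete.

```python
import numpy as np

def online_sofm_step(weights, x, lr, radius):
    """Original Kohonen rule: weights are updated immediately
    after presentation of a single input vector x."""
    winner = int(np.argmin(np.abs(weights - x)))
    for i in range(len(weights)):
        if abs(i - winner) <= radius:
            weights[i] += lr * (x - weights[i])
    return weights

def batch_sofm_step(weights, batch, lr, radius):
    """Batch-style variant (illustrative): winners and deltas are
    computed for a whole group of inputs against the SAME frozen
    weights, and the accumulated update is applied only once at
    the end of the group. Because each input is processed against
    identical weights, the group can be handled in parallel."""
    delta = np.zeros_like(weights)
    for x in batch:
        winner = int(np.argmin(np.abs(weights - x)))
        for i in range(len(weights)):
            if abs(i - winner) <= radius:
                delta[i] += lr * (x - weights[i])
    return weights + delta
```

For a group of size one the two rules coincide; for larger groups the batch rule trades slightly staler weights for per-input independence, which is the finer grain of parallelism the paper exploits in hardware.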