LEARNING ALGORITHMS USING FIRING NUMBERS OF WEIGHT VECTORS FOR WTA NETWORKS IN ROTATION-INVARIANT PATTERN-CLASSIFICATION

Citation
S. Ren et al., LEARNING ALGORITHMS USING FIRING NUMBERS OF WEIGHT VECTORS FOR WTA NETWORKS IN ROTATION-INVARIANT PATTERN-CLASSIFICATION, IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, E81A(1), 1998, pp. 175-182
Citations number
17
Subject Categories
Engineering, Electrical & Electronic; Computer Science, Hardware & Architecture; Computer Science, Information Systems
ISSN journal
0916-8508
Volume
E81A
Issue
1
Year of publication
1998
Pages
175 - 182
Database
ISI
SICI code
0916-8508(1998)E81A:1<175:LAUFNO>2.0.ZU;2-G
Abstract
This paper focuses on competitive learning algorithms for WTA (winner-take-all) networks that perform rotation-invariant pattern classification. Although WTA networks could in theory achieve rotation-invariant pattern classification given infinite memory capacity, actual networks cannot memorize all input data. To memorize the input patterns, i.e., the vectors to be classified, effectively, we present two algorithms for learning vectors in classes (LVC1 and LVC2), in which the cells of the network memorize not only weight vectors but also their firing numbers as statistical values of the vectors. The LVC1 algorithm uses simple, ordinary competitive-learning functions but incorporates the firing number into a coefficient of the weight-change equation. In addition to all the functions of LVC1, the LVC2 algorithm has a mechanism for reusing underutilized weight vectors. Theoretical analysis shows that the LVC2 algorithm works to minimize the energy of all weight vectors and thus forms an effective memory. Computer simulations with two-dimensional rotated patterns show that LVC2 has better learning and generalization abilities than LVC1, and that both are better than the conventional Kohonen self-organizing feature map (SOFM) and learning vector quantization (LVQ1). Furthermore, incorporating the firing number into the weight-change equation is shown to be efficient for both LVC1 and LVC2 in achieving higher learning and generalization abilities. The theoretical analysis given here applies not only to rotation-invariant pattern classification but also to other WTA networks for learning vector quantization.
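
As an illustration only (the abstract does not reproduce the paper's equations), the following Python sketch shows one plausible reading of the two mechanisms described above: a winner-take-all update whose coefficient shrinks with the winner's firing number (the LVC1-style idea), and a re-seeding step for underutilized weight vectors (the LVC2-style idea). The class name FiringNumberWTA, the 1/n coefficient form, the min_firing threshold, and the omission of any rotation-invariant preprocessing are all assumptions for the sketch, not the authors' formulation.

    import numpy as np

    class FiringNumberWTA:
        """Sketch of a WTA layer whose cells store weight vectors
        together with their firing numbers (assumed form)."""

        def __init__(self, n_cells, dim, lr=0.5, seed=0):
            self.rng = np.random.default_rng(seed)
            self.w = self.rng.normal(size=(n_cells, dim))  # weight vectors
            self.firing = np.zeros(n_cells)                # firing numbers
            self.lr = lr

        def train_step(self, x):
            # Winner-take-all competition: the nearest weight vector fires.
            winner = int(np.argmin(np.linalg.norm(self.w - x, axis=1)))
            self.firing[winner] += 1.0
            # LVC1-style update: the firing number enters the coefficient of
            # the weight-change equation (a 1/n form is assumed here), so
            # frequently firing cells change less and their weights stabilize.
            eta = self.lr / self.firing[winner]
            self.w[winner] += eta * (x - self.w[winner])
            return winner

        def reuse_underutilized(self, data, min_firing=1.0):
            # LVC2-style step (assumed mechanism): re-seed cells that fired
            # fewer than min_firing times onto random training samples so
            # memory capacity is not wasted on dead weight vectors.
            dead = np.flatnonzero(self.firing < min_firing)
            if dead.size:
                idx = self.rng.integers(0, len(data), size=dead.size)
                self.w[dead] = data[idx]
                self.firing[dead] = 0.0

    # Example: learn prototypes for points drawn around two clusters.
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal([1.0, 0.0], 0.1, (50, 2)),
                      rng.normal([0.0, 1.0], 0.1, (50, 2))])
    net = FiringNumberWTA(n_cells=4, dim=2)
    for epoch in range(5):
        for x in rng.permutation(data):
            net.train_step(x)
        net.reuse_underutilized(data)

The 1/n coefficient makes each weight vector converge toward the running mean of the samples it wins, which is one common way a firing number acts as a statistical value of the memorized vectors; the paper's actual equations and energy analysis should be consulted for the precise form.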