GROWING RADIAL BASIS NEURAL NETWORKS - MERGING SUPERVISED AND UNSUPERVISED LEARNING WITH NETWORK GROWTH TECHNIQUES

Citation
N.B. Karayiannis and G.W. Mi, GROWING RADIAL BASIS NEURAL NETWORKS - MERGING SUPERVISED AND UNSUPERVISED LEARNING WITH NETWORK GROWTH TECHNIQUES, IEEE Transactions on Neural Networks, 8(6), 1997, pp. 1492-1506
Citations number
26
Subject categories
"Computer Application, Chemistry & Engineering", "Engineering, Electrical & Electronic", "Computer Science Artificial Intelligence", "Computer Science Hardware & Architecture", "Computer Science Theory & Methods"
ISSN journal
10459227
Volume
8
Issue
6
Year of publication
1997
Pages
1492 - 1506
Database
ISI
SICI code
1045-9227(1997)8:6<1492:GRBNN->2.0.ZU;2-#
Abstract
This paper proposes a framework for constructing and training radial basis function (RBF) neural networks. The proposed growing radial basis function (GRBF) network begins with a small number of prototypes, which determine the locations of the radial basis functions. In the process of training, the GRBF network grows by splitting one of the prototypes at each growing cycle. Two splitting criteria are proposed to determine which prototype to split in each growing cycle. The proposed hybrid learning scheme provides a framework for incorporating existing algorithms in the training of GRBF networks. These include unsupervised algorithms for clustering and learning vector quantization, as well as learning algorithms for training single-layer linear neural networks. A supervised learning scheme based on the minimization of the localized class-conditional variance is also proposed and tested. GRBF neural networks are evaluated and tested on a variety of data sets, with very satisfactory results.
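The abstract outlines a growing cycle: start from a few prototypes, score each prototype with a splitting criterion, split the selected prototype, and retrain the output layer. The sketch below illustrates such a loop in Python, assuming Gaussian basis functions, regularized least-squares output weights, and a splitting score based on local class variance; all names and details are illustrative stand-ins, not the exact algorithm or criteria from the paper.

# Minimal sketch of a growing-RBF classifier loop (illustrative assumptions,
# not the paper's exact method).
import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian response of each input to each prototype (center).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_output_weights(H, Y):
    # Single-layer linear map from hidden responses to class targets;
    # regularized least squares stands in for the supervised step.
    return np.linalg.solve(H.T @ H + 1e-6 * np.eye(H.shape[1]), H.T @ Y)

def local_class_variance(X, y, centers, width):
    # For each prototype, sum the per-class variance of the inputs that
    # activate it most strongly -- a stand-in splitting criterion.
    H = rbf_design_matrix(X, centers, width)
    owner = H.argmax(axis=1)
    scores = np.zeros(len(centers))
    for j in range(len(centers)):
        local, labels = X[owner == j], y[owner == j]
        if len(local) < 2:
            continue
        scores[j] = sum(local[labels == c].var(axis=0).sum()
                        for c in np.unique(labels))
    return scores, owner

def grow_rbf(X, y, n_classes, width=1.0, max_prototypes=8):
    Y = np.eye(n_classes)[y]                      # one-hot targets
    centers = X[np.random.choice(len(X), 2, replace=False)].copy()
    while len(centers) < max_prototypes:
        scores, owner = local_class_variance(X, y, centers, width)
        j = scores.argmax()                       # prototype to split
        local = X[owner == j]
        if len(local) < 2:
            break
        # Split: replace the chosen prototype with two perturbed copies.
        centers = np.vstack([np.delete(centers, j, axis=0),
                             local.mean(axis=0) + 0.1 * local.std(axis=0),
                             local.mean(axis=0) - 0.1 * local.std(axis=0)])
    H = rbf_design_matrix(X, centers, width)
    return centers, fit_output_weights(H, Y)

# Usage: two Gaussian blobs, two classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
centers, W = grow_rbf(X, y, n_classes=2)
pred = (rbf_design_matrix(X, centers, 1.0) @ W).argmax(axis=1)
print("training accuracy:", (pred == y).mean())

Each pass through the while loop adds one prototype, mirroring the growth by splitting described in the abstract; the unsupervised clustering and learning vector quantization components mentioned there would replace the simple ownership-by-maximum-response step used here.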