A NEURAL-NETWORK ARCHITECTURE FOR INCREMENTAL LEARNING

Citation
S. Shiotani et al., A NEURAL-NETWORK ARCHITECTURE FOR INCREMENTAL LEARNING, Neurocomputing, 9(2), 1995, pp. 111-130
Citations number
14
Subject Categories
Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences
Journal title
Neurocomputing
ISSN journal
09252312
Volume
9
Issue
2
Year of publication
1995
Pages
111 - 130
Database
ISI
SICI code
0925-2312(1995)9:2<111:ANAFIL>2.0.ZU;2-D
Abstract
Artificial neural networks have been used as tools for category classification. A neural network can correctly classify patterns on which it has already been trained. However, it sometimes misclassifies patterns that it has never been trained on, and it must be retrained to correct these errors. In this retraining, the multi-layered perceptron (MLP) must learn both new patterns and old patterns: a new pattern is one the MLP cannot yet classify correctly, and an old pattern is one it has already learned. The MLP is therefore inefficient in computing cost because it must relearn the old patterns. The adaptive resonance theory (ART) model can memorize new patterns without relearning the old ones, owing to incremental learning, but its classification ability is limited. This paper proposes a neural network architecture for incremental learning, called the 'Neural network based on Distance between Patterns' (NDP). The NDP has a two-layered hierarchical structure with many radial basis function neurons in the output layer. The NDP performs incremental learning by adding neurons to the output layer and varying the center and the gradient of each radial basis function. Thus the NDP can memorize new patterns without relearning the old patterns and has superior classification ability. The NDP differs from conventional radial basis function neural networks in its incremental learning. In addition, this paper shows the effectiveness of the NDP in experiments on image recognition.
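The abstract only sketches the mechanism; the Python fragment below is a minimal, illustrative approximation of that idea, not the published NDP algorithm. The Gaussian basis functions, the width-shrinking rule, and the names (IncrementalRBF, learn, predict) are assumptions introduced here for illustration.

import numpy as np

class IncrementalRBF:
    """Toy incremental RBF classifier (a sketch of the idea described in
    the abstract, not the published NDP algorithm)."""

    def __init__(self):
        self.centers = []  # RBF centers, one per output-layer neuron
        self.widths = []   # per-neuron width (the RBF "gradient")
        self.labels = []   # class label attached to each neuron

    def _activations(self, x):
        # Gaussian radial basis functions of the distance to each center
        return np.array([
            np.exp(-np.sum((x - c) ** 2) / (2.0 * w ** 2))
            for c, w in zip(self.centers, self.widths)
        ])

    def predict(self, x):
        if not self.centers:
            return None
        return self.labels[int(np.argmax(self._activations(x)))]

    def learn(self, x, label, init_width=1.0, shrink=0.9):
        # Incremental step: only the new pattern is presented; old
        # patterns are never re-trained.
        x = np.asarray(x, dtype=float)
        if self.predict(x) == label:
            return  # already classified correctly, nothing to learn
        # Narrow any wrong-class neuron that responds strongly to the new
        # pattern (an assumed stand-in for "varying the gradient").
        for i, (c, lab) in enumerate(zip(self.centers, self.labels)):
            act = np.exp(-np.sum((x - c) ** 2) / (2.0 * self.widths[i] ** 2))
            if lab != label and act > 0.5:
                self.widths[i] *= shrink
        # Add a new output-layer neuron centered on the new pattern.
        self.centers.append(x)
        self.widths.append(init_width)
        self.labels.append(label)

In this sketch, calling learn() on a stream of labelled feature vectors grows the output layer only when a vector is misclassified, which is the property the abstract contrasts with retraining an MLP on both old and new patterns.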