COMPARATIVE-STUDY OF SEARCHING METHOD IN RECURRENT NEURAL NETWORKS

Citation
K. Yokoi et al., COMPARATIVE-STUDY OF SEARCHING METHOD IN RECURRENT NEURAL NETWORKS, Systems and computers in Japan, 27(12), 1996, pp. 22-32
Citations number
17
Subject Categories
"Computer Science Hardware & Architecture", "Computer Science Information Systems", "Computer Science Theory & Methods"
ISSN journal
0882-1666
Volume
27
Issue
12
Year of publication
1996
Pages
22 - 32
Database
ISI
SICI code
0882-1666(1996)27:12<22:COSMIR>2.0.ZU;2-2
Abstract
Off-line supervised learning in a recurrent neural network can be regarded as a search for the minimum of the surface (called the learning surface) formed by the evaluation function. The learning surface has a characteristic shape, and the authors have proposed a valley searching method based on the global shape of the learning surface. Various methods, such as the steepest descent (gradient) method and the conjugate gradient method, are often employed in the search for the minimum. However, the relation between the search method and the characteristic shape of the learning surface has not been clarified, and it has been difficult to assess quantitatively the usefulness of the searching methods. From this viewpoint, this paper considers three methods, namely the steepest descent method, the conjugate gradient method, and the valley searching method, and analyzes the relation between each searching method and the learning surface. Simulation results show experimentally that the valley searching method is superior. More precisely, the dependence of the learning curve and the searching process on the parameters of each searching method is examined experimentally, and it is shown that the parameters are easier to set in the valley searching method than in the other methods. The valley searching method is also shown to be better in terms of convergence time and dependence on the initial value.
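The abstract's framing (learning as minimum search on a valley-shaped surface, with steepest descent as a baseline) can be illustrated with a toy sketch. Everything below is an illustrative assumption: the Rosenbrock-style function stands in for the paper's RNN evaluation function only because it has the narrow curved valley the abstract describes, and the step size and iteration count are arbitrary; the paper's valley searching method itself is not reproduced here.

```python
import numpy as np

# Hypothetical 2-D "learning surface" with a narrow curved valley
# (Rosenbrock-style toy function, NOT the paper's RNN evaluation function).
def f(w):
    x, y = w
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def grad_f(w):
    x, y = w
    return np.array([
        -2 * (1 - x) - 400 * x * (y - x ** 2),  # df/dx
        200 * (y - x ** 2),                      # df/dy
    ])

def steepest_descent(w0, lr=1e-3, steps=20000):
    """Baseline method from the abstract: step against the local gradient.

    On a narrow valley this makes slow progress along the valley floor,
    which is the behavior that motivates valley-following searches.
    """
    w = np.array(w0, dtype=float)
    for _ in range(steps):
        w -= lr * grad_f(w)
    return w

w0 = np.array([-1.0, 1.0])      # illustrative initial value
w_star = steepest_descent(w0)
print("start  f =", f(w0))
print("final  f =", f(w_star), "at w =", w_star)
```

The many iterations needed even on this small problem reflect the parameter sensitivity (step size versus valley curvature) that the paper examines experimentally for each searching method.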