Off-line supervised learning in a recurrent neural network can be viewed as a search for the minimum of the surface (called the learning surface) formed by the evaluation function. The learning surface has a characteristic shape, and the authors have proposed the valley searching method, which exploits the global shape of the learning surface. Various methods, such as the steepest descent (gradient) method and the conjugate gradient method, are often employed to search for the minimum. However, the relation between the search method and the characteristic shape of the learning surface has not been clarified, and it has been difficult to assess quantitatively the usefulness of these search methods. From this viewpoint, this paper considers three methods, namely, the steepest descent method, the conjugate gradient method, and the valley searching method, and analyzes the relation between the search method and the learning surface. Simulation results show experimentally that the valley searching method is superior. More precisely, the dependence of the learning curve and the search process on the parameters of each search method is examined experimentally, and it is shown that the parameters are set more easily in the valley searching method than in the other methods. It is also shown that the valley searching method is superior in terms of convergence time and dependence on the initial value.
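The steepest descent baseline discussed above can be sketched as follows on a toy quadratic evaluation function whose contours form an elongated valley; the function, learning rate, and starting point are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def steepest_descent(grad, x0, lr=0.05, tol=1e-8, max_iter=1000):
    """Minimize by repeatedly stepping against the gradient
    until the gradient norm falls below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - lr * g
    return x

# Toy evaluation function E(x, y) = x^2 + 10*y^2, a narrow "valley":
# its gradient is (2x, 20y).
grad_E = lambda p: np.array([2.0 * p[0], 20.0 * p[1]])

minimum = steepest_descent(grad_E, [3.0, 1.0])
```

Note the sensitivity to the learning rate `lr`: on this function, raising it to 0.1 makes the steep (y) direction oscillate without converging, which illustrates the parameter-tuning difficulty the paper reports for steepest descent.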