EXPERIMENTAL STUDIES OF MEMORY SURFACES AND LEARNING SURFACES IN RECURRENT NEURAL NETWORKS

Citation
T. Watanabe et al., "Experimental Studies of Memory Surfaces and Learning Surfaces in Recurrent Neural Networks," Systems and Computers in Japan, 25(8), 1994, pp. 27-39
Number of citations
10
Subject categories
"Computer Science Hardware & Architecture", "Computer Science Information Systems", "Computer Science Theory & Methods"
ISSN journal
0882-1666
Volume
25
Issue
8
Year of publication
1994
Pages
27 - 39
Database
ISI
SICI code
0882-1666(1994)25:8<27:ESOMSA>2.0.ZU;2-D
Abstract
The following approach is considered important for gaining insight into "learning" in layered networks. Consider the adjustable parameter vector w that satisfies the required input-output relation of the network; the "memory surface" in w-space, formed by the set of solutions of the nonlinear equation for w, should then be investigated. It is also shown that the idea of the memory surface can be extended to the supervised learning of recurrent neural networks (abbreviated RNN). This paper demonstrates the following properties experimentally. When the evaluation function E is defined as the square integral of the error, the behavior of the RNN agrees essentially with the results obtained for the layered network in the following four points: (1) a memory surface exists in the RNN in the same way as in the layered network; (2) supervised learning of the RNN amounts geometrically to a search, in the intersection region of multiple memory surfaces, for the global or a local minimum of E on the learning surface formed in w-E space with w as the variable; (3) a characteristic feature of the learning surface of the RNN is that it is composed of a long "valley" and a flat "hill" with a smaller gradient; and (4) the learning process, a successive search based on the gradient vector of E, tends to progress near the memory surface along the valley.
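
To make the geometry concrete, the sketch below (not from the paper; the toy task and all names such as rnn_outputs and error_E are illustrative assumptions) treats the objective E(w) of a one-unit recurrent network as a surface over its weight vector w, using a discrete-time sum of squared errors as an analogue of the square-integral error, and runs the successive gradient search the abstract describes. A memory surface here corresponds to the solution set E(w) = 0, i.e., weights that reproduce the target sequence exactly.

    # Minimal sketch: gradient descent on the learning surface E(w)
    # of a one-unit RNN (assumed setup, not the authors' experiment).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy input/target sequences the RNN should memorize.
    T = 20
    inputs = rng.normal(size=T)
    targets = np.sin(np.linspace(0.0, 2.0 * np.pi, T))

    def rnn_outputs(w, inputs):
        """Run a one-unit recurrent network; w = (w_rec, w_in, w_out)."""
        w_rec, w_in, w_out = w
        h = 0.0
        ys = np.empty(len(inputs))
        for t, x in enumerate(inputs):
            h = np.tanh(w_rec * h + w_in * x)   # recurrent state update
            ys[t] = w_out * h                   # linear readout
        return ys

    def error_E(w):
        """Discrete analogue of the square-integral error E."""
        e = rnn_outputs(w, inputs) - targets
        return float(np.sum(e * e))

    def num_grad(f, w, eps=1e-5):
        """Central-difference gradient of f at w."""
        g = np.zeros_like(w)
        for i in range(len(w)):
            wp, wm = w.copy(), w.copy()
            wp[i] += eps
            wm[i] -= eps
            g[i] = (f(wp) - f(wm)) / (2.0 * eps)
        return g

    # Successive search based on the gradient vector of E: the descent
    # trajectory moves through (w, E) space toward a minimum of the
    # learning surface.
    w = rng.normal(scale=0.5, size=3)
    lr = 0.01
    for step in range(2000):
        w -= lr * num_grad(error_E, w)

    print("final E(w) =", error_E(w))

Plotting E along the descent trajectory of such a sketch would be one way to look for the behavior in points (3) and (4): a rapid initial drop into a valley of the learning surface, followed by slower progress along the valley near a memory surface.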