AN ANALYSIS OF THE GAMMA-MEMORY IN DYNAMIC NEURAL NETWORKS

Citation
J.C. Principe et al., AN ANALYSIS OF THE GAMMA-MEMORY IN DYNAMIC NEURAL NETWORKS, IEEE Transactions on Neural Networks, 5(2), 1994, pp. 331-337
Citations number
20
Categorie Soggetti
Computer Application, Chemistry & Engineering; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
5
Issue
2
Year of publication
1994
Pages
331 - 337
Database
ISI
SICI code
1045-9227(1994)5:2<331:AAOTGI>2.0.ZU;2-L
Abstract
In this paper we present a vector space framework to study short-term memory filters in dynamic neural networks. We define parameters to quantify the function of feedforward and recursive linear memory filters. We show, using vector spaces, which optimization problem is solved by the PEs of the first hidden layer of the single-input focused network architecture. Due to the special properties of the gamma bases, recursion brings an extra parameter lambda (the time constant of the leaky integrator) that displaces the memory manifold towards the desired signal when the mean square error is minimized. In contrast, for the feedforward memory filter the angle between the desired signal and the memory manifold is fixed for a given memory order. The adaptation of the feedback parameter can be done using gradient descent, but the optimization is nonconvex.
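
The following is a minimal sketch, in Python, of the recursive gamma memory structure the abstract refers to: a cascade of identical leaky integrators whose single feedback parameter (written lam below, matching the lambda of the abstract) trades memory depth against temporal resolution. The recursion x_k[n] = (1 - lam) * x_k[n-1] + lam * x_{k-1}[n-1] is the standard gamma-memory form from the gamma-model literature and is an assumption here, since the abstract does not spell out the filter equations; all function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def gamma_memory(u, lam, order):
    """Run a gamma memory of the given order over the input signal u.

    Assumed standard gamma recursion (not quoted from the paper):
        x_0[n] = u[n]
        x_k[n] = (1 - lam) * x_k[n-1] + lam * x_{k-1}[n-1],  k = 1..order
    Returns an array of shape (len(u), order + 1) whose columns span the
    "memory manifold" discussed in the abstract.
    """
    n_samples = len(u)
    x = np.zeros((n_samples, order + 1))
    x[:, 0] = u  # tap 0 is the raw input signal
    for n in range(1, n_samples):
        for k in range(1, order + 1):
            x[n, k] = (1.0 - lam) * x[n - 1, k] + lam * x[n - 1, k - 1]
    return x

if __name__ == "__main__":
    # Example: in a focused architecture these taps would be weighted and
    # fed to the first hidden layer; lam = 1 recovers a plain tap delay line.
    rng = np.random.default_rng(0)
    u = rng.standard_normal(200)
    taps = gamma_memory(u, lam=0.5, order=4)
    print(taps.shape)  # (200, 5)
```

The nonconvexity noted in the abstract arises when lam itself is adapted by gradient descent along with the output weights, since the taps depend on lam recursively through time.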