A NEURAL-NETWORK SOLUTION TO THE TRANSVERSE PATTERNING PROBLEM DEPENDS ON REPETITION OF THE INPUT CODE

Citation
X.B. Wu et al., A NEURAL-NETWORK SOLUTION TO THE TRANSVERSE PATTERNING PROBLEM DEPENDS ON REPETITION OF THE INPUT CODE, Biological cybernetics, 79(3), 1998, pp. 203-213
Citations number
41
Categorie Soggetti
Computer Science, Cybernetics; Neurosciences
Journal title
Biological Cybernetics
ISSN journal
03401200
Volume
79
Issue
3
Year of publication
1998
Pages
203 - 213
Database
ISI
SICI code
0340-1200(1998)79:3<203:ANSTTT>2.0.ZU;2-8
Abstract
Using computer simulations, this paper investigates how input codes affect a minimal computational model of the hippocampal region CA3. Because encoding context seems to be a function of the hippocampus, we have studied problems that require learning context for their solution. Here we study a hippocampally dependent, configural learning problem called transverse patterning. Previously, we showed that the network does not produce long local context codings when the sequential input patterns are orthogonal, and it fails to solve many context-dependent problems in such situations. Here we show that this need not be the case if we assume that the input changes more slowly than a processing interval. Stuttering, i.e., repeating inputs, allows the network to create long local context firings even for orthogonal inputs. With these long local context firings, the network is able to solve the transverse patterning problem. Without stuttering, transverse patterning is not learned. Because stuttering is so useful, we investigate the relationship between the stuttering repetition length and relative context length in a simple, idealized sequence prediction problem. The relative context length, defined as the average length of the local context codes divided by the stuttering length, interacts with activity levels and has an optimal stuttering repetition length. Moreover, the increase in average context length can reach this maximum without loss of relative capacity. Finally, we note that stuttering is an example of maintained or introduced redundancy that can improve neural computations.
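The two quantities central to the abstract — stuttering (each input pattern repeated for several processing intervals) and the relative context length (average local context code length divided by the stuttering length) — can be sketched in a few lines. This is a minimal illustration of the definitions only, not the authors' CA3 simulation; the function names, the repetition factor, and the example context lengths are all assumptions.

```python
import numpy as np

def stutter(patterns, r):
    """Repeat each input pattern r consecutive times ("stuttering")."""
    return [p for p in patterns for _ in range(r)]

def relative_context_length(context_lengths, r):
    """Average local context code length divided by the stuttering length."""
    return float(np.mean(context_lengths)) / r

# Three orthogonal (one-hot) inputs, analogous to the A, B, C stimuli
# of transverse patterning.
patterns = [np.eye(3)[i] for i in range(3)]

stuttered = stutter(patterns, r=4)
print(len(stuttered))  # 3 patterns x 4 repetitions = 12

# Hypothetical context code lengths measured after training:
print(relative_context_length([8, 6, 10], r=4))  # mean 8.0 / 4 = 2.0
```

On this definition, a relative context length above 1 means the network's local context firings outlast the raw input repetition, which is the regime the abstract reports as enabling a solution to transverse patterning.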