A NONLINEAR HEBBIAN NETWORK THAT LEARNS TO DETECT DISPARITY IN RANDOM-DOT STEREOGRAMS

Citation
C.W. Lee and B.A. Olshausen, A NONLINEAR HEBBIAN NETWORK THAT LEARNS TO DETECT DISPARITY IN RANDOM-DOT STEREOGRAMS, Neural Computation, 8(3), 1996, pp. 545-566
Citations number
33
Subject Categories
Computer Sciences; Computer Science, Artificial Intelligence; Neurosciences
Journal title
Neural Computation
ISSN journal
08997667
Volume
8
Issue
3
Year of publication
1996
Pages
545 - 566
Database
ISI
SICI code
0899-7667(1996)8:3<545:ANHNTL>2.0.ZU;2-6
Abstract
An intrinsic limitation of linear, Hebbian networks is that they are capable of learning only from the linear pairwise correlations within an input stream. To explore what higher forms of structure could be learned with a nonlinear Hebbian network, we constructed a model network containing a simple form of nonlinearity and we applied it to the problem of learning to detect the disparities present in random-dot stereograms. The network consists of three layers, with nonlinear sigmoidal activation functions in the second-layer units. The nonlinearities allow the second layer to transform the pixel-based representation in the input layer into a new representation based on coupled pairs of left-right inputs. The third layer of the network then clusters patterns occurring on the second-layer outputs according to their disparity via a standard competitive learning rule. Analysis of the network dynamics shows that the second-layer units' nonlinearities interact with the Hebbian learning rule to expand the region over which pairs of left-right inputs are stable. The learning rule is neurobiologically inspired and plausible, and the model may shed light on how the nervous system learns to use coincidence detection in general.
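
To make the architecture described in the abstract concrete, the following is a minimal Python/NumPy sketch of a network of this general kind, not the authors' implementation. The 1-D stereogram generator, layer sizes, tanh nonlinearity, weight-normalization scheme, learning rates, and set of candidate disparities are all illustrative assumptions rather than values taken from the paper; only the overall structure (a sigmoidal second layer trained with a Hebbian rule, followed by a third layer trained with standard competitive learning) follows the abstract.

import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D random-dot stereogram: the right-eye row is a shifted copy of
# the left-eye row (assumed input format, for illustration only).
def make_stereogram(n_pixels=16, disparity=0):
    left = (rng.random(n_pixels) > 0.5).astype(float)
    right = np.roll(left, disparity)
    return np.concatenate([left, right]), disparity

# Illustrative sizes and learning rates (not from the paper).
n_in, n_hidden, n_out = 32, 20, 3
disparities = [0, 1, 2]
eta_hebb, eta_comp = 0.01, 0.05

W1 = rng.normal(0, 0.1, (n_hidden, n_in))     # input -> hidden weights
W2 = rng.random((n_out, n_hidden))            # hidden -> output weights
W2 /= np.linalg.norm(W2, axis=1, keepdims=True)

for step in range(5000):
    x, d = make_stereogram(disparity=rng.choice(disparities))

    # Second layer: sigmoidal (tanh) units with a plain Hebbian update;
    # row-wise normalization keeps the weight vectors bounded.
    y = np.tanh(W1 @ x)
    W1 += eta_hebb * np.outer(y, x)
    W1 /= np.linalg.norm(W1, axis=1, keepdims=True)

    # Third layer: standard competitive learning; the winning unit's
    # weights move toward the current second-layer activity pattern.
    z = W2 @ y
    winner = np.argmax(z)
    W2[winner] += eta_comp * (y - W2[winner])
    W2[winner] /= np.linalg.norm(W2[winner])

# After training, check which output unit tends to win for each disparity.
for d in disparities:
    wins = [np.argmax(W2 @ np.tanh(W1 @ make_stereogram(disparity=d)[0]))
            for _ in range(200)]
    print(f"disparity {d}: most frequent winner = {np.bincount(wins).argmax()}")

If the clustering succeeds, each disparity should be reported by a distinct, consistent winning unit; with a sketch this simple, partial overlap between clusters is to be expected.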