MONOCULAR MECHANISMS DETERMINE PLAID MOTION COHERENCE

Citation
D. Alais et al., MONOCULAR MECHANISMS DETERMINE PLAID MOTION COHERENCE, Visual Neuroscience, 13(4), 1996, pp. 615-626
Number of citations
45
Subject categories
Neurosciences
Journal title
Visual Neuroscience
Journal ISSN
09525238
Volume
13
Issue
4
Year of publication
1996
Pages
615 - 626
Database
ISI
SICI code
0952-5238(1996)13:4<615:MMDPMC>2.0.ZU;2-D
Abstract
Although the neural location of the plaid motion coherence process is not precisely known, the middle temporal (MT) cortical area has been proposed as a likely candidate. This claim rests largely on neurophysiological findings showing that, in response to plaid stimuli, a subgroup of cells in area MT responds to the pattern direction, whereas cells in area V1 respond only to the directions of the component gratings. In Experiment 1, we report that the coherent motion of a plaid pattern can be completely abolished following adaptation to a grating which moves in the plaid direction and has the same spatial period as the plaid features (the so-called "blobs"). Interestingly, we find this phenomenon is monocular: monocular adaptation destroys plaid coherence in the exposed eye but leaves it unaffected in the other eye. Experiment 2 demonstrates that adaptation to a purely binocular (dichoptic) grating does not affect perceived plaid coherence. These data suggest several conclusions: (1) the mechanism determining plaid coherence responds to the motion of plaid features; (2) the coherence mechanism is monocular; and thus (3) it is probably located at a relatively low level in the visual system, peripheral to the binocular mechanisms commonly presumed to underlie two-dimensional (2-D) motion perception. Experiment 3 examines the spatial tuning of the monocular coherence mechanism, and our results suggest it is broadly tuned with a preference for lower spatial frequencies. In Experiment 4, we examine whether perceived plaid direction is determined by the motion of the grating components or by the features. Our data strongly support a feature-based model.
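As background for the component-versus-pattern distinction discussed in the abstract, a standard textbook formulation (not part of the original abstract; the symbols f, theta, omega below are illustrative) writes a plaid as the sum of two drifting sinusoidal gratings:

\[
P(x,y,t) = \cos\!\big(2\pi f_1 (x\cos\theta_1 + y\sin\theta_1) - 2\pi\omega_1 t\big) + \cos\!\big(2\pi f_2 (x\cos\theta_2 + y\sin\theta_2) - 2\pi\omega_2 t\big)
\]

Each component grating constrains the 2-D pattern velocity \(\vec{v}\) only along its own normal, \(\vec{v}\cdot\hat{n}_i = s_i\); solving the two constraints together (the intersection-of-constraints solution) yields the pattern direction that MT pattern cells are reported to signal, whereas V1 component cells signal the individual normal directions \(\hat{n}_i\). For a symmetric plaid with equal component speeds \(s\) and normals at \(\pm\theta\) from the bisecting axis, the pattern speed is \(s/\cos\theta\) along that axis, which is also the direction in which the plaid features ("blobs") at the grating intersections translate.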