Auditory-visual integration during multimodal object recognition in humans: A behavioral and electrophysiological study

Citation
M.H. Giard and F. Peronnet, Auditory-visual integration during multimodal object recognition in humans: A behavioral and electrophysiological study, J COGN NEUR, 11(5), 1999, pp. 473-490
Citations number
86
Subject categories
Neurosciences & Behavior
Journal title
JOURNAL OF COGNITIVE NEUROSCIENCE
ISSN journal
0898-929X
Volume
11
Issue
5
Year of publication
1999
Pages
473 - 490
Database
ISI
SICI code
0898-929X(199909)11:5<473:AIDMOR>2.0.ZU;2-O
Abstract
The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: at each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two objects were defined by auditory features alone, visual features alone, or the combination of auditory and visual features. Subjects were more accurate and rapid at identifying multimodal than unimodal objects. Spatiotemporal analysis of ERPs and scalp current densities revealed several auditory-visual interaction components that were temporally, spatially, and functionally distinct before 200 msec poststimulus. The effects observed were (1) in visual areas, new neural activities (as early as 40 msec poststimulus) and modulation (amplitude decrease) of the N185 wave to the unimodal visual stimulus; (2) in the auditory cortex, modulation (amplitude increase) of subcomponents of the unimodal auditory N1 wave around 90 to 110 msec; and (3) new neural activity over the right fronto-temporal area (140 to 165 msec). Furthermore, when the subjects were separated into two groups according to their dominant modality for performing the task in unimodal conditions (shortest reaction time criterion), the integration effects were found to be similar for the two groups over the nonspecific fronto-temporal areas, but they clearly differed in the sensory-specific cortices, affecting predominantly the sensory areas of the nondominant modality. Taken together, the results indicate that multisensory integration is mediated by flexible, highly adaptive physiological processes that can take place very early in the sensory processing chain and operate in both sensory-specific and nonspecific cortical structures in different ways.