Influence of head position on the spatial representation of acoustic targets

Citation
H.H.L.M. Goossens and A.J. Van Opstal, Influence of head position on the spatial representation of acoustic targets, J. Neurophysiol., 81(6), 1999, pp. 2720-2736
Citations number
49
Subject category
Neurosciences & Behavior
Journal title
JOURNAL OF NEUROPHYSIOLOGY
ISSN journal
0022-3077
Volume
81
Issue
6
Year of publication
1999
Pages
2720 - 2736
Database
ISI
SICI code
0022-3077(199906)81:6<2720:IOHPOT>2.0.ZU;2-#
Abstract
Influence of head position on the spatial representation of acoustic targets. J. Neurophysiol. 81:2720-2736, 1999. Sound localization in humans relies on binaural differences (azimuth cues) and monaural spectral shape information (elevation cues) and is therefore the result of a neural computational process. Despite the fact that these acoustic cues are referenced with respect to the head, accurate eye movements can be generated to sounds in complete darkness. This ability necessitates the use of eye position information. So far, however, sound localization has been investigated mainly with a fixed head position, usually straight ahead. Yet the auditory system may rely on head motor information to maintain a stable and spatially accurate representation of acoustic targets in the presence of head movements. We therefore studied the influence of changes in eye-head position on auditory-guided orienting behavior of human subjects. In the first experiment, we used a visual-auditory double-step paradigm. Subjects made saccadic gaze shifts in total darkness toward brief broadband sounds presented before an intervening eye-head movement that was evoked by an earlier visual target. The data show that the preceding displacements of both eye and head are fully accounted for, resulting in spatially accurate responses. This suggests that auditory target information may be transformed into a spatial (or body-centered) frame of reference. To further investigate this possibility, we exploited the unique property of the auditory system that sound elevation is extracted independently from pinna-related spectral cues. In the absence of such cues, accurate elevation detection is not possible, even when head movements are made. This is shown in a second experiment where pure tones were localized at a fixed elevation that depended on the tone frequency rather than on the actual target elevation, both under head-fixed and head-free conditions.
To test, in a third experiment, whether the perceived elevation of tones relies on a head- or space-fixed target representation, eye movements were elicited toward pure tones while subjects kept their head in different vertical positions. It appeared that each tone was localized at a fixed, frequency-dependent elevation in space that shifted to a limited extent with changes in head elevation. Hence information about head position is used under static conditions too. Interestingly, the influence of head position also depended on the tone frequency. Thus tone-evoked ocular saccades typically showed a partial compensation for changes in static head position, whereas noise-evoked eye-head saccades fully compensated for intervening changes in eye-head position. We propose that the auditory localization system combines the acoustic input with head-position information to encode targets in a spatial (or body-centered) frame of reference. In this way, accurate orienting responses may be programmed despite intervening eye-head movements. A conceptual model, based on the tonotopic organization of the auditory system, is presented that may account for our findings.