Multisensory information for human postural control: integrating touch and vision

Citation
J. Jeka et al., Multisensory information for human postural control: integrating touch and vision, EXP BRAIN R, 134(1), 2000, pp. 107-125
Number of citations
45
Subject categories
Neurosciences & Behavior
Journal title
EXPERIMENTAL BRAIN RESEARCH
ISSN journal
0014-4819
Volume
134
Issue
1
Year of publication
2000
Pages
107 - 125
Database
ISI
SICI code
0014-4819(200009)134:1<107:MIFHPC>2.0.ZU;2-M
Abstract
Despite extensive research on the influence of visual, vestibular and somatosensory information on human postural control, it remains unclear how these sensory channels are fused for self-orientation. The focus of the present study was to test whether a linear additive model could account for the fusion of touch and vision for postural control. We simultaneously manipulated visual and somatosensory (touch) stimuli in five conditions of single- and multisensory stimulation. The visual stimulus was a display of random dots projected onto a screen in front of the standing subject. The somatosensory stimulus was a rigid plate which subjects contacted lightly (<1 N of force) with their right index fingertip. In each condition, one sensory stimulus oscillated (dynamic) in the medial-lateral direction while the other stimulus was either dynamic, static or absent. The results qualitatively supported five predictions of the linear additive model in that the patterns of gain and variability across conditions were consistent with model predictions. However, a strict quantitative comparison revealed significant deviations from model predictions, indicating that the sensory fusion process clearly has nonlinear aspects. We suggest that the sensory fusion process behaved in an approximately linear fashion because the experimental paradigm tested postural control very close to the equilibrium point of vertical upright.
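As a minimal sketch of the linear additive model under test (illustrative notation only; the gains, delays and noise term below are assumptions, not symbols taken from the paper), medial-lateral body sway x(t) would be a weighted superposition of the responses to the touch and visual stimuli:

\[ x(t) = G_T\, s_T(t - \tau_T) + G_V\, s_V(t - \tau_V) + n(t) \]

where s_T(t) and s_V(t) are the displacements of the touch plate and the visual display, G_T and G_V are fixed gains, \tau_T and \tau_V are response delays, and n(t) is a noise term. Under such superposition, for example, the gain to a dynamic stimulus should be unchanged whether the other stimulus is dynamic, static or absent; departures from this kind of additivity are what the abstract identifies as the nonlinear aspects of sensory fusion.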