OCULO-MOTOR STABILIZATION REFLEXES - INTEGRATION OF INERTIAL AND VISUAL INFORMATION

Citation
F. Panerai and G. Sandini, OCULO-MOTOR STABILIZATION REFLEXES - INTEGRATION OF INERTIAL AND VISUAL INFORMATION, Neural Networks, 11(7-8), 1998, pp. 1191-1204
Citations number
38
Subject Categories
Computer Science, Artificial Intelligence
Journal title
Neural Networks
ISSN journal
08936080
Volume
11
Issue
7-8
Year of publication
1998
Pages
1191 - 1204
Database
ISI
SICI code
0893-6080(1998)11:7-8<1191:OSR-IO>2.0.ZU;2-W
Abstract
Stabilization of gaze is a fundamental requirement of an active visual system for at least two reasons: (i) to increase the robustness of dynamic visual measures during the observer's motion; (ii) to provide a reference with respect to the environment (Ballard and Brown, 1992). The aim of this paper is to address the former issue by investigating the role of integration of visuo-inertial information in gaze stabilization. The rationale comes from observations of how the stabilization problem is solved in biological systems, and experimental results based on an artificial visual system equipped with space-variant visual sensors and an inertial sensor are presented. In particular, the following issues are discussed: (i) the relations between eye-head geometry, fixation distance and stabilization performance; (ii) the computational requirements of the visuo-inertial stabilization approach compared to a visual stabilization approach; (iii) the evaluation of performance of the visuo-inertial strategy in a real-time monocular stabilization task. Experiments are performed to quantitatively describe the performance of the system with respect to different choices of the principal parameters. The results show that the integrated approach is indeed valuable: it makes use of visual computational resources more efficiently, extends the range of motions or external disturbances the system can effectively deal with, and reduces system complexity. (C) 1998 Elsevier Science Ltd. All rights reserved.
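Note: the abstract does not give the control law used by the authors. The following minimal Python sketch only illustrates the general visuo-inertial idea described above, combining a gyroscope-measured head rate (inertial, fast feed-forward) with measured retinal slip (visual, slower feedback) into an eye-velocity command. The function name, gains, and smoothing constant are hypothetical and chosen for illustration, not taken from the paper.

    # Hypothetical sketch of visuo-inertial gaze stabilization (not the authors' code).
    # The inertial signal drives an immediate counter-rotation of the eye, while the
    # visual slip signal corrects the residual drift that the inertial path misses.

    def stabilization_command(head_rate, retinal_slip, eye_rate_prev,
                              k_inertial=1.0, k_visual=0.3, dt=0.01):
        """Return an eye angular-velocity command (rad/s).

        head_rate     -- head angular velocity from the inertial sensor (rad/s)
        retinal_slip  -- residual image slip measured on the visual sensor (rad/s)
        eye_rate_prev -- previous eye-velocity command, used for smoothing
        Gains and the smoothing time constant are illustrative assumptions.
        """
        feedforward = -k_inertial * head_rate    # counter-rotate against head motion
        feedback = -k_visual * retinal_slip      # cancel the remaining visual slip
        command = feedforward + feedback
        alpha = dt / (dt + 0.05)                 # first-order smoothing of the command
        return (1 - alpha) * eye_rate_prev + alpha * command

    # Example call for a single control step:
    # eye_rate = stabilization_command(head_rate=0.2, retinal_slip=0.05, eye_rate_prev=0.0)

In this kind of scheme the inertial path reduces the load on the visual processing, since the visual loop only has to cancel a small residual slip rather than the full disturbance, which is consistent with the efficiency argument made in the abstract.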