Multisensory visual servoing by a neural network

Citation
G.Q. Wei and G. Hirzinger, Multisensory visual servoing by a neural network, IEEE SYST B, 29(2), 1999, pp. 276-280
Citations number
18
Subject categories
AI Robotics and Automatic Control
Journal title
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS
ISSN journal
1083-4419
Volume
29
Issue
2
Year of publication
1999
Pages
276 - 280
Database
ISI
SICI code
1083-4419(199904)29:2<276:MVSBAN>2.0.ZU;2-C
Abstract
Conventional computer vision methods for determining a robot's end-effector motion from sensory data need sensor calibration (e.g., camera calibration) and sensor-to-hand calibration (e.g., hand-eye calibration). This involves many computations and even some difficulties, especially when different kinds of sensors are involved. In this correspondence, we present a neural network approach to the motion determination problem without any calibration. Two kinds of sensory data, namely, camera images and laser range data, are used as the input to a multilayer feedforward network to learn the direct transformation from the sensory data to the required motions. This provides a practical sensor fusion method. Using a recursive motion strategy and in terms of a network correction, we relax the requirement for the exactness of the learned transformation. Another important feature of our work is that the goal position can be changed without having to do network retraining. Experimental results show the effectiveness of our method.
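The core idea of the abstract can be sketched in a few lines: a multilayer feedforward network takes concatenated camera-image features and laser range readings as input and outputs an end-effector motion increment directly, with no calibration step, applied inside a recursive sense-move loop. The sketch below is purely illustrative; the dimensions, hidden-layer size, and activation are assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical dimensions (not from the paper): image features,
# laser range readings, and a 6-DOF motion increment.
IMG_DIM, LASER_DIM, MOTION_DIM = 8, 4, 6
HIDDEN = 16

rng = np.random.default_rng(0)

# One hidden layer with tanh activation -- a minimal multilayer
# feedforward network; in practice the weights would be trained
# on (sensor reading, required motion) pairs.
W1 = rng.normal(scale=0.1, size=(IMG_DIM + LASER_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, MOTION_DIM))
b2 = np.zeros(MOTION_DIM)

def predict_motion(image_features, laser_ranges):
    """Fuse the two sensor modalities by simple concatenation and map
    them directly to an end-effector motion (no camera or hand-eye
    calibration is modeled anywhere in this pipeline)."""
    x = np.concatenate([image_features, laser_ranges])
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

# Recursive motion strategy: apply the predicted (possibly inexact)
# motion, re-sense, and repeat -- the loop corrects for an imperfectly
# learned transformation.
for _ in range(3):
    motion = predict_motion(rng.normal(size=IMG_DIM),
                            rng.normal(size=LASER_DIM))
    # ...command the robot by `motion`, then acquire new sensor data...

print(motion.shape)
```

Because the network maps sensory error to motion rather than to an absolute pose, changing the goal only changes the target sensor reading, which is why no retraining is needed for a new goal position.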