EYE-IN-HAND ROBOTIC TASKS IN UNCALIBRATED ENVIRONMENTS

Citation
C.E. Smith et al., EYE-IN-HAND ROBOTIC TASKS IN UNCALIBRATED ENVIRONMENTS, IEEE Transactions on Robotics and Automation, 13(6), 1997, pp. 903-914
Citations number
36
ISSN journal
1042296X
Volume
13
Issue
6
Year of publication
1997
Pages
903 - 914
Database
ISI
SICI code
1042-296X(1997)13:6<903:ERTIUE>2.0.ZU;2-Q
Abstract
Flexible operation of a robotic agent in an uncalibrated environment requires the ability to recover unknown or partially known parameters of the workspace through sensing. Of the sensors available to a robotic agent, visual sensors provide information that is richer and more complete than other sensors. In this paper we present robust techniques for the derivation of depth from feature points on a target's surface and for the accurate and high-speed tracking of moving targets. We use these techniques in a system that operates with little or no a priori knowledge of object- and camera-related parameters to robustly determine such object-related parameters as velocity and depth. Such determination of extrinsic environmental parameters is essential for performing higher level tasks such as inspection, exploration, tracking, grasping, and collision-free motion planning. For both applications, we use the Minnesota robotic visual tracker (MRVT) (a single visual sensor mounted on the end-effector of a robotic manipulator combined with a real-time vision system) to automatically select feature points on surfaces, to derive an estimate of the environmental parameter in question, and to supply a control vector based upon these estimates to guide the manipulator.
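The abstract describes recovering depth from tracked feature points and feeding a control vector back to the eye-in-hand manipulator. The sketch below is only an illustrative outline of that general idea, not the MRVT algorithm from the paper: it assumes a hypothetical depth-from-known-camera-motion estimate for a single feature and a standard image-Jacobian (interaction-matrix) velocity command; all function names, gains, and numbers are made up for illustration.

```python
import numpy as np

def estimate_depth(x, y, flow, cam_trans_vel, dt):
    """Estimate depth Z of a static feature at normalized image coords (x, y).

    Assumes a purely translational camera velocity (vx, vy, vz); the image
    motion of the point is then
        x_dot = (-vx + x*vz) / Z,   y_dot = (-vy + y*vz) / Z,
    which is solved for 1/Z in a least-squares sense.
    """
    vx, vy, vz = cam_trans_vel
    x_dot, y_dot = flow[0] / dt, flow[1] / dt
    a = np.array([-vx + x * vz, -vy + y * vz])  # coefficients of 1/Z
    b = np.array([x_dot, y_dot])                # observed image velocities
    inv_z = np.dot(a, b) / np.dot(a, a)
    return 1.0 / inv_z

def servo_command(x, y, x_ref, y_ref, depth, gain=0.5):
    """Proportional image-based command driving the feature toward (x_ref, y_ref).

    Uses the translational columns of the point-feature interaction matrix and
    returns a camera translational velocity (vx, vy, vz).
    """
    error = np.array([x - x_ref, y - y_ref])
    L = np.array([[-1.0 / depth, 0.0, x / depth],
                  [0.0, -1.0 / depth, y / depth]])
    # Pseudo-inverse maps the image-plane error to a camera velocity command.
    return -gain * np.linalg.pinv(L) @ error

if __name__ == "__main__":
    # Hypothetical numbers: feature at (0.10, 0.05) shifted by (0.0002, 0.0001)
    # over one 30 Hz frame while the camera moved at 0.05 m/s along its axis.
    Z = estimate_depth(0.10, 0.05, (0.0002, 0.0001), (0.0, 0.0, 0.05), 1 / 30)
    print("estimated depth [m]:", Z)
    print("velocity command [m/s]:", servo_command(0.10, 0.05, 0.0, 0.0, Z))
```

In this toy setup the estimated depth then parameterizes the interaction matrix, so the image-space error can be turned into a Cartesian velocity for the end-effector-mounted camera; the paper's actual feature selection, tracking, and control formulation should be taken from the article itself.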