A SELF-CALIBRATION TECHNIQUE FOR ACTIVE VISION SYSTEMS

Authors
Citation
S.D. Ma, A SELF-CALIBRATION TECHNIQUE FOR ACTIVE VISION SYSTEMS, IEEE Transactions on Robotics and Automation, 12(1), 1996, pp. 114-120
Citations number
17
Subject Categories
"Computer Application, Chemistry & Engineering","Control Theory & Cybernetics","Robotics & Automatic Control","Engineering, Electrical & Electronic"
ISSN journal
1042296X
Volume
12
Issue
1
Year of publication
1996
Pages
114 - 120
Database
ISI
SICI code
1042-296X(1996)12:1<114:ASTFAV>2.0.ZU;2-I
Abstract
Many vision research groups have developed active vision platforms whereby the camera motion can be controlled. A similar setup is the wrist-mounted camera for a robot manipulator. This head-eye (or hand-eye) setup considerably facilitates motion stereo, object tracking, and active perception. One of the important issues in using an active vision system is to determine the camera position and orientation relative to the camera platform. This problem is called head-eye calibration in active vision, and hand-eye calibration in robotics. In this paper we present a new technique for calibrating the head-eye (or hand-eye) geometry as well as the camera intrinsic parameters. The technique allows camera self-calibration because it requires no reference object and directly uses images of the environment. Camera self-calibration is especially important in circumstances where the execution of the underlying visual tasks does not permit the use of reference objects. Our method exploits the flexibility of the active vision system and bases camera calibration on a sequence of specially designed motions. It is shown that if the camera intrinsic parameters are known a priori, the orientation of the camera relative to the platform can be solved using three pure translational motions. If the intrinsic parameters are unknown, then two sequences of motion, each consisting of three orthogonal translations, are necessary to determine the camera orientation and intrinsic parameters. Once the camera orientation and intrinsic parameters are determined, the position of the camera relative to the platform can be computed from an arbitrary nontranslational motion of the platform. All the computations in our method are linear. Experimental results with real images are presented in this paper.
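As an illustration of the orientation step described in the abstract, the sketch below shows one way the head-eye rotation could be recovered from three pure platform translations when the intrinsic parameters are known: each commanded translation direction is known in the platform frame, the same direction can be observed in the camera frame (for instance from the focus of expansion of the induced image motion), and aligning the two sets of directions yields the rotation. This is a minimal sketch under those assumptions; the NumPy-based orthogonal-Procrustes solve, the function names, and the focus-of-expansion estimation are illustrative choices, not the paper's exact linear formulation.

import numpy as np

def head_eye_rotation(t_platform, d_camera):
    """Least-squares rotation R such that d_camera[i] ~= R @ t_platform[i]."""
    P = np.asarray(t_platform, dtype=float)   # platform-frame translation directions, one per row
    C = np.asarray(d_camera, dtype=float)     # corresponding directions observed in the camera frame
    P = P / np.linalg.norm(P, axis=1, keepdims=True)
    C = C / np.linalg.norm(C, axis=1, keepdims=True)
    # Orthogonal Procrustes: maximize trace(R @ P.T @ C) over rotations R.
    U, _, Vt = np.linalg.svd(C.T @ P)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # enforce det(R) = +1 (no reflection)
    return U @ D @ Vt

if __name__ == "__main__":
    # Synthetic check: three orthogonal platform translations viewed through a known rotation.
    theta = np.deg2rad(30.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
    t = np.eye(3)                 # three orthogonal translation directions (platform frame)
    d = (R_true @ t.T).T          # the same directions expressed in the camera frame
    R_est = head_eye_rotation(t, d)
    print(np.allclose(R_est, R_true))  # expected: True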