We have designed a research platform for a perceptually guided robot, which also serves as a demonstrator for a coming generation of service robots. In order to operate semi-autonomously, these robots require a capacity for learning about their environment and tasks, and they will have to interact directly with their human operators. Thus, they must be supplied with skills in the fields of human-computer interaction, vision, and manipulation. GripSee is able to autonomously grasp and manipulate objects on a table in front of it. The choice of object, the grip to be used, and the desired final position are indicated by an operator using hand gestures. Grasping is performed in a manner similar to human behavior: the object is first fixated; then its form, size, orientation, and position are determined; a grip is planned; and finally the object is grasped, moved to a new position, and released.
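This fixate-estimate-plan-execute sequence can be read as a simple pipeline. The following Python sketch is purely illustrative: every name in it (fixate, estimate_object, plan_grip, and so on) is a hypothetical placeholder for the corresponding GripSee subsystem, not part of the platform's actual interface.

    # Illustrative pipeline only; all methods are hypothetical stand-ins
    # for GripSee's perception and manipulation subsystems.
    def pick_and_place(robot, object_cue, target_position):
        robot.fixate(object_cue)                    # center gaze on the indicated object
        form, size, pose = robot.estimate_object()  # form, size, orientation, position
        grip = robot.plan_grip(form, size, pose)    # choose a grip for this object
        robot.grasp(grip)                           # close the gripper on the object
        robot.move_to(target_position)              # transport it to the desired position
        robot.release()                             # set the object down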
for useful autonomous behavior we show how the calibration of the robot's i
mage-to-world coordinate transform can be learned from experience, thus mak
ing detailed and unstable calibration of this important subsystem superfluo
us. The integration concepts developed at our institute have led to a flexi
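One simple way to realize such learning, under the simplifying assumption that an affine map between image and table-plane coordinates is an adequate approximation, is a least-squares fit over correspondences the robot gathers itself, for instance by observing its own gripper at known arm positions. The abstract does not specify GripSee's actual procedure; the sketch below only illustrates the idea.

    import numpy as np

    # Hedged sketch: fit an affine image-to-world map from self-gathered
    # correspondences. GripSee's actual learning scheme may differ.
    def fit_image_to_world(image_pts, world_pts):
        """image_pts, world_pts: (N, 2) arrays of corresponding points, N >= 3."""
        img = np.asarray(image_pts, dtype=float)
        wld = np.asarray(world_pts, dtype=float)
        # Homogeneous design matrix [u, v, 1] for each image point.
        A = np.hstack([img, np.ones((len(img), 1))])
        # Solve A @ M = wld in the least-squares sense; M is a 3x2 matrix.
        M, *_ = np.linalg.lstsq(A, wld, rcond=None)
        return M

    def image_to_world(M, u, v):
        return np.array([u, v, 1.0]) @ M

    # Three non-collinear correspondences define the map exactly;
    # additional points average out observation noise.
    M = fit_image_to_world([(0, 0), (100, 0), (0, 100)],
                           [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5)])
    print(image_to_world(M, 50, 50))  # -> approximately [0.25, 0.25]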
The integration concepts developed at our institute have led to a flexible library of robot skills that can be easily recombined for a variety of useful behaviors.