In this paper we present methods for tracking complex, articulated objects. We assume that an appearance model and the kinematic structure of the object to be tracked are given, leading to what is termed a model-based object tracker. At each time step, this tracker observes a new monocular grayscale image of the scene and combines information gathered from this image with knowledge of the previous configuration of the object to estimate the configuration of the object at the time the image was acquired. Each degree of freedom in the model has an associated uncertainty, indicating the confidence in the current estimate for that degree of freedom. These uncertainty estimates are updated after each observation. An extended Kalman filter with appropriate observation and system models is used to implement this updating process. The methods that we describe are potentially beneficial to areas such as automated visual tracking in general, visual servo control, and human-computer interaction.
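As an illustration of this updating process, the following is a minimal sketch (not the implementation described in the paper) of one extended Kalman filter predict/update cycle for a state vector containing one entry per degree of freedom. The functions f and h and their Jacobians F and H are placeholders for the system and observation models, which are problem specific.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter (illustrative sketch).

    x    : (n,) prior state estimate (e.g. joint angles of the articulated model)
    P    : (n, n) prior covariance (uncertainty per degree of freedom)
    z    : (m,) measurement extracted from the new image
    f, F : system model and its Jacobian, evaluated at x
    h, H : observation model and its Jacobian, evaluated at the predicted state
    Q, R : system and measurement noise covariances
    """
    # Predict: propagate the state and its uncertainty through the system model.
    x_pred = f(x)
    F_x = F(x)
    P_pred = F_x @ P @ F_x.T + Q

    # Update: correct the prediction using the measurement from the image.
    H_x = H(x_pred)
    y = z - h(x_pred)                       # innovation
    S = H_x @ P_pred @ H_x.T + R            # innovation covariance
    K = P_pred @ H_x.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H_x) @ P_pred
    return x_new, P_new
```

In this sketch the covariance P plays the role of the per-degree-of-freedom confidence described above: it grows during prediction (through Q) and shrinks when an informative measurement arrives, exactly the behavior the tracker relies on when weighing image evidence against the previous configuration.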