Monitoring of large sites requires coordination between multiple cameras, which in turn requires methods for relating events between distributed cameras. This paper tackles the problem of automatic external calibration of multiple cameras in an extended scene, that is, full recovery of their 3D relative positions and orientations. Because the cameras are placed far apart, brightness or proximity constraints cannot be used to match static features, so we instead apply planar geometric constraints to moving objects tracked throughout the scene. By robustly matching and fitting tracked objects to a planar model, we align the scene's ground plane across multiple views and decompose the planar alignment matrix to recover the 3D relative camera and ground-plane positions. We demonstrate this technique both in a controlled lab setting, where we test the effects of errors in the intrinsic camera parameters, and in an uncontrolled outdoor setting. In the latter, we do not assume synchronized cameras, and we show that enforcing geometric constraints enables us to align the tracking data in time. In spite of noise in the intrinsic camera parameters and in the image data, the system successfully transforms multiple views of the scene's ground plane to an overhead view and recovers the relative 3D camera and ground-plane positions.
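The central geometric step described above, fitting a planar alignment matrix (a homography) to matched ground-plane observations, can be sketched with the standard direct linear transform. This is a minimal generic illustration, not the paper's implementation; the function names and the synthetic data are ours.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H mapping src -> dst (Nx2 arrays, N >= 4)
    via the direct linear transform (DLT): two linear constraints per
    correspondence, solved as the null space of the stacked system."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    # The null vector is the right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the overall scale ambiguity

def apply_homography(H, pts):
    """Map Nx2 points through H (homogeneous multiply, then dehomogenize)."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Synthetic check: ground-plane tracks seen in one view, mapped into a second
# view by a known homography, should be recovered exactly from 4+ matches.
H_true = np.array([[0.9, 0.1, 5.0],
                   [-0.2, 1.1, -3.0],
                   [1e-4, 2e-4, 1.0]])
src = np.array([[0., 0.], [100., 0.], [100., 80.], [0., 80.], [50., 40.]])
dst = apply_homography(H_true, src)
H_est = fit_homography(src, dst)
print(np.allclose(H_est, H_true, atol=1e-6))
```

In practice, correspondences derived from tracked objects contain noise and mismatches, so a fit like this would be wrapped in a robust estimator such as RANSAC; the recovered matrix is then decomposed, together with the intrinsic parameters, into the relative rotation, translation, and ground-plane orientation that the abstract refers to.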