This paper describes the MAGI (microscope-assisted guided interventions) augmented-reality system, which allows surgeons to view virtual features segmented from preoperative radiological images accurately overlaid in stereo in the optical path of a surgical microscope. The aim of the system is to enable the surgeon to see, in the correct 3-D position, the structures that lie beneath the physical surface. The technical challenges involved are calibration, segmentation, registration, tracking, and visualization. This paper details our solutions to these problems. As it is difficult to make reliable quantitative assessments of the accuracy of augmented-reality systems, results are presented from a numerical simulation, which show that the system has a theoretical overlay accuracy of better than 1 mm at the focal plane of the microscope. Implementations of the system have been tested on volunteers, phantoms, and seven patients in the operating room. Observations are consistent with this accuracy prediction.