Photometric reconstruction is the process of estimating the illumination and surface reflectance properties of an environment, given a geometric model of the scene and a set of photographs of its surfaces. For mixed-reality applications, such data is required if synthetic objects are to be correctly illuminated or if synthetic light sources are to be used to re-light the scene. Current methods of estimating such data are limited in the practical situations in which they can be applied, because the geometric and radiometric models of the scene provided by the user must be complete, and the position (and in some cases, intensity) of the light sources must be specified a priori. In this paper a novel algorithm is presented which overcomes these constraints and allows photometric data to be reconstructed in less restricted situations. This is achieved through the use of virtual light sources which mimic the effect of direct illumination from unknown luminaires and of indirect illumination reflected off unknown geometry. The intensity of these virtual light sources and the surface material properties are estimated using an iterative algorithm which attempts to match calculated radiance values to those observed in photographs. Results are presented for both synthetic and real scenes, showing the quality of the reconstructed data and its use in off-line mixed-reality applications.
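To make the estimation step concrete, the following is a minimal sketch, not the paper's implementation, of one plausible form of such an iterative fit. It assumes a purely diffuse model in which the calculated radiance of patch i is rho[i] * sum_j F[i, j] * I[j], where F is a hypothetical geometric transfer matrix from virtual light sources to surface patches, I are the unknown virtual light intensities, and rho are the unknown surface albedos; the fit alternates between solving for I and for rho so that the calculated radiances match the observed values B.

```python
# Hypothetical illustration only (not the paper's algorithm): alternating
# estimation of virtual light intensities I and diffuse albedos rho so that
# the computed radiance rho[i] * sum_j F[i, j] * I[j] matches observed B[i].
import numpy as np

def estimate_photometry(F, B, n_iters=50):
    """F: (n_patches, n_lights) geometric transfer (form factor * visibility),
       B: (n_patches,) radiance values observed in the photographs."""
    n_patches, n_lights = F.shape
    I = np.ones(n_lights)            # virtual light intensities (unknown)
    rho = np.full(n_patches, 0.5)    # diffuse albedos (unknown)
    for _ in range(n_iters):
        # Step 1: with albedos fixed, the intensities follow from a linear
        # least-squares fit of the calculated to the observed radiances.
        A = rho[:, None] * F
        I, *_ = np.linalg.lstsq(A, B, rcond=None)
        I = np.clip(I, 0.0, None)    # intensities must be non-negative
        # Step 2: with intensities fixed, each albedo has a closed-form update.
        irradiance = F @ I
        rho = np.where(irradiance > 1e-9, B / np.maximum(irradiance, 1e-9), rho)
        rho = np.clip(rho, 0.0, 1.0)
    return I, rho

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    F = rng.uniform(0.0, 1.0, size=(200, 3))    # synthetic transfer matrix
    I_true = np.array([5.0, 1.0, 2.5])
    rho_true = rng.uniform(0.2, 0.9, size=200)
    B = rho_true * (F @ I_true)                 # noiseless "photograph" radiances
    I_est, rho_est = estimate_photometry(F, B)
    print("estimated intensities:", I_est)
```

Note that in this simplified diffuse formulation there is a global scale ambiguity between the albedos and the light intensities (scaling I up and rho down by the same factor leaves the calculated radiance unchanged), so the alternating scheme recovers the product rho * I rather than each factor absolutely unless some additional constraint is imposed.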