The standard technique for measuring the phase of a single-mode field is heterodyne detection. Such a measurement may have an uncertainty far above the intrinsic quantum phase uncertainty of the state. Recently it has been shown [H. M. Wiseman and R. B. Killip, Phys. Rev. A 57, 2169 (1998)] that an adaptive technique introduces far less excess noise. Here we quantify this difference by an exact numerical calculation of the minimum measured phase variance for the various schemes, optimized over states with a fixed mean photon number. We also analytically derive the asymptotics for these variances. For the case of heterodyne detection our results disagree with the power law claimed by D'Ariano and Paris [Phys. Rev. A 49, 3022 (1994)]. [S1050-2947(99)04009-3]