We analyze systematic (classical) and fundamental (quantum) limitations of
the sensitivity of optical magnetometers resulting from ac Stark shifts. We
show that, in contrast to absorption-based techniques, the signal reduction
associated with classical broadening can be compensated in magnetometers
based on phase measurements using electromagnetically induced transparency
(EIT). However, due to ac Stark-associated quantum noise, the signal-to-noise
ratio of EIT-based magnetometers attains a maximum value at a certain laser
intensity. This value is independent of the quantum statistics of the light
and defines a standard quantum limit of sensitivity. We demonstrate that an
EIT-based optical magnetometer in the Faraday configuration is the best
candidate for achieving the highest sensitivity of magnetic-field detection,
and we give a detailed analysis of such a device.