We present a self-consistent theory of phase control of photoabsorption in an
optically dense medium, together with an illustrative application to a realistic
system. We demonstrate that propagation effects, once taken into account, have a
significant impact on phase control. Independently of the initial phase
difference between the two fields, over a short scaled propagation distance the
medium tends to settle the relative phase at a value that cancels the atomic
excitation. In addition, we find rather unusual behavior for an optically thin
layer.