The accessible time resolution in femtosecond infrared experiments is shorter than the typical phase relaxation time of a vibronic transition. Therefore, coherent interaction of the light pulses with the sample may disturb the observed absorbance signals. Coherence results in an artifact known as perturbed free induction decay, which may be misinterpreted as an intrinsic incoherent temporal evolution of the sample. In the present paper, a model is presented that describes this effect for the general situation in which a complex molecule containing many overlapping vibrational modes is investigated. The model leads to an efficient linear least-squares fit algorithm that allows the analysis of large data sets. The model and the fit algorithm are applied to transient absorbance changes observed in a large dye molecule. It is demonstrated that an ultrafast energy relaxation process can be separated from the perturbed free induction decay signal. In addition, the analysis of the perturbed free induction decay effect itself allows one to obtain information on the instantaneous absorbance change of the sample.