Thermal vibrations destroy the perfect crystalline periodicity generally assumed by dynamical diffraction theories. This can lead to some difficulty in deriving the temperature dependence of X-ray reflectivity from otherwise perfect crystals. This difficulty is overcome here in numerical simulations based on the extended Darwin theory, which does not require periodicity. Using Si and Ge as model materials, it is shown how to map the lattice vibrations derived from measured phonon dispersion curves onto a suitable Darwin model. Good agreement is observed with the usual Debye-Waller behavior predicted by standard theories, except at high temperatures for high-order reflections. These deviations are discussed in terms of a possible breakdown of the ergodic hypothesis for X-ray diffraction.
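The Debye-Waller behavior mentioned above can be sketched numerically. The snippet below evaluates the textbook attenuation factor I(T)/I(0) = exp(-2M), with M = B(sin θ/λ)² and sin θ/λ = 1/(2d) from Bragg's law; it is an illustrative sketch only, and the B value and d-spacings are placeholder numbers, not results from this paper.

```python
import math

def debye_waller_attenuation(B, d):
    """Intensity ratio I(T)/I(0) = exp(-2M) for a reflection with
    d-spacing d (angstrom), where M = B * (sin(theta)/lambda)**2,
    sin(theta)/lambda = 1/(2d) by Bragg's law, and B = 8*pi**2*<u**2>
    is the isotropic Debye-Waller parameter (angstrom**2)."""
    M = B / (4.0 * d * d)
    return math.exp(-2.0 * M)

# Placeholder inputs (not from the paper): B ~ 0.5 A^2 near room
# temperature; one large and one small d-spacing.
low_order = debye_waller_attenuation(0.5, 3.14)   # low-order reflection
high_order = debye_waller_attenuation(0.5, 0.78)  # high-order reflection
```

Because M grows as 1/d², high-order reflections (small d) are attenuated far more strongly, which is precisely the regime where the abstract reports deviations from the standard prediction.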