We have recently introduced a new method of searching a time series for periodic variability. The method uses the Shannon entropy to measure the amount of information provided by a set of observations that may contain an underlying periodic signal, as a function of the assumed period of this hypothetical signal. Here we present the analytical arguments that support this algorithm within the broader framework of information theory. We also show that, in the absence of a periodic signal, the entropies follow a Gaussian distribution, which provides a straightforward way of assessing the significance of a positive detection. We test the method on simulated data with non-sinusoidal variability and show that it is more sensitive than the classical periodograms, or than variants of them adapted to handle harmonics. Finally, we show that the method is capable of resolving two nearly identical frequencies present in a given time series, even in cases where the classical periodograms fail to do so.
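The entropy-minimization idea summarized above can be illustrated with a toy sketch. This is not the authors' implementation; the grid size, trial-period grid, and simulated signal are all assumptions chosen for illustration. The point is only the core mechanism: phase-folding the data at a trial period close to the true one concentrates the points onto a narrow curve in the (phase, magnitude) plane, which minimizes the Shannon entropy of the cell-occupation probabilities.

```python
import numpy as np

def shannon_entropy(t, y, period, nbins=8):
    """Shannon entropy of the phase-folded points on an nbins x nbins grid."""
    phase = (t / period) % 1.0
    # Normalize the measurements to [0, 1] so the grid covers the data.
    ynorm = (y - y.min()) / (y.max() - y.min() + 1e-12)
    hist, _, _ = np.histogram2d(phase, ynorm, bins=nbins,
                                range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def best_period(t, y, trial_periods):
    """Return the trial period whose folding yields minimal entropy."""
    entropies = np.array([shannon_entropy(t, y, P) for P in trial_periods])
    return trial_periods[int(np.argmin(entropies))]

# Irregularly sampled, non-sinusoidal signal with noise (simulated here).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 50.0, 400))
true_period = 2.5
y = np.sin(2 * np.pi * t / true_period) ** 3   # non-sinusoidal shape
y += 0.05 * rng.normal(size=t.size)

trials = np.linspace(1.0, 5.0, 2001)
print(best_period(t, y, trials))
```

Folding at the wrong period scatters the points over many grid cells, raising the entropy, so the minimum over the trial grid marks the candidate period. The Gaussian behaviour of the entropies in the noise-only case, mentioned above, is what allows turning the depth of such a minimum into a significance estimate.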