A numerical algorithm is presented for estimating whether, and roughly to what extent, a time series is noise corrupted. Using phase-randomized surrogates constructed from the original signal, metrics are defined which can be used to quantify the noise level. These metrics saturate at signal-to-noise ratios (SNRs) of around 0 dB and below, and again at around 20 dB and above. Between these two regions the metrics pass monotonically from one saturation value to the other as the SNR changes. (C) 1997 American Institute of Physics.
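
The abstract does not spell out the noise metrics themselves, but the surrogate construction it refers to is standard Fourier phase randomization. The sketch below is a minimal illustration of that step only; the function name and the NumPy-based implementation are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Return a surrogate series with the same amplitude spectrum as x
    but with Fourier phases drawn uniformly at random."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    n = x.size

    # One-sided FFT of the real-valued series
    spectrum = np.fft.rfft(x)

    # Random phases for the interior bins; keep the DC bin (and the
    # Nyquist bin for even n) real so the inverse transform is real
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0

    surrogate_spectrum = np.abs(spectrum) * np.exp(1j * phases)

    # Inverse transform gives a real surrogate of the original length
    return np.fft.irfft(surrogate_spectrum, n=n)
```

A surrogate built this way shares the linear (power-spectral) properties of the original signal while destroying any nonlinear structure, which is why comparing metrics evaluated on the data against the same metrics evaluated on an ensemble of surrogates can serve as a noise-level indicator of the kind the abstract describes.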