Timebase distortion causes nonlinear distortion of the waveforms measured by sampling instruments. When such instruments are used to measure the root-mean-square (rms) amplitude of the sampled waveforms, these distortions produce errors in the measured rms values. This paper examines the nature of the errors that result from nonrandom quantization errors in an instrument's timebase circuit. Simulations and measurements on a sampling voltmeter show that the errors in measured rms amplitude have a nonnormal probability distribution, such that the probability of large errors is much greater than would be expected from the usual quantization noise model. A novel timebase compensation method is proposed that makes the measured rms errors normally distributed and reduces their standard deviation by a factor of 25. This compensation method was applied to a sampling voltmeter, and the improved accuracy was realized.
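
The central mechanism described here, deterministic (nonrandom) quantization of sample instants by the timebase, can be illustrated with a short simulation. The Python sketch below is not the paper's simulation; all parameter values (amplitude A, frequency f, sampling rate fs, record length N, and timebase step q) are assumptions chosen for illustration. It compares the rms of a sine sampled on a quantized time grid against the rms obtained with ideal timing, isolating the error contributed by the timebase, and then summarizes the empirical error distribution.

```python
import numpy as np

# Assumed illustrative parameters (not taken from the paper):
A = 1.0          # sine amplitude, V
f = 1.0e3        # sine frequency, Hz
fs = 48.7e3      # nominal sampling rate, Hz
N = 4096         # samples per record
q = 50e-9        # timebase quantization step, s

rng = np.random.default_rng(0)
t_nominal = np.arange(N) / fs
# Nonrandom quantization: every sample instant snaps to the same timebase
# grid, so within a record the timing errors are deterministic, not jitter.
t_actual = np.round(t_nominal / q) * q

def rms(x):
    return np.sqrt(np.mean(x ** 2))

errors = []
for _ in range(2000):
    phase = rng.uniform(0.0, 2.0 * np.pi)
    v_ideal = A * np.sin(2.0 * np.pi * f * t_nominal + phase)  # perfect timebase
    v_quant = A * np.sin(2.0 * np.pi * f * t_actual + phase)   # quantized timebase
    # Differencing the two rms values isolates the timebase-induced error.
    errors.append(rms(v_quant) - rms(v_ideal))

errors = np.asarray(errors)
m, s = errors.mean(), errors.std()
# Excess kurtosis > 0 means heavier tails than a normal distribution.
excess_kurtosis = np.mean(((errors - m) / s) ** 4) - 3.0
print(f"mean rms error:  {m:.3e} V")
print(f"std deviation:   {s:.3e} V")
print(f"excess kurtosis: {excess_kurtosis:.2f}")
```

In an experiment of this kind, a strongly positive excess kurtosis over many records would be consistent with the heavy-tailed, nonnormal error distribution the paper reports, as opposed to the light-tailed behavior predicted by the usual quantization noise model.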