We investigate the number of iterations needed by an addition algorithm due to Burks et al. when the input is random. Several authors have obtained results on the average-case behaviour, mainly using analytic techniques based on generating functions. Here we take a more probabilistic view, which leads to a limit theorem for the distribution of the random number of steps required by the algorithm and also helps to explain the limiting logarithmic periodicity as a simple discretization phenomenon.
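The iteration count in question can be illustrated with a minimal sketch, assuming the standard formulation of the Burks–Goldstine–von Neumann scheme: sums and carries are formed in parallel in each round, and the algorithm stops once no carry remains. The function name `bitwise_add` is ours, not the paper's.

```python
def bitwise_add(a, b):
    """Add two non-negative integers by repeated sum/carry rounds.

    Each round replaces (a, b) by the bitwise half-adder outputs:
    the carry-free sum a XOR b, and the carries (a AND b) shifted
    left one position. Returns the sum and the number of rounds,
    which is the quantity whose distribution the paper studies.
    """
    steps = 0
    while b:
        a, b = a ^ b, (a & b) << 1
        steps += 1
    return a, steps
```

For example, `bitwise_add(5, 3)` needs four rounds because the carry generated at the lowest bit propagates through the run of ones; on random n-bit inputs the number of rounds is governed by the longest carry chain, which is why it grows only logarithmically in n.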