We investigate the number of iterations needed by an addition algorithm due to
Burks et al. when the input is random. Several authors have obtained results on
the average-case behaviour, mainly using analytic techniques based on generating functions. Here we
take a more probabilistic view, which leads to a limit theorem for the distribution of the random
number of steps required by the algorithm and also helps to explain the limiting logarithmic
periodicity as a simple discretization phenomenon.
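As a concrete illustration (not part of the paper itself), the following Python sketch simulates the quantity under study. It assumes that the algorithm in question is the familiar iterative carry-propagation scheme attributed to Burks, Goldstine and von Neumann, and that "number of iterations" means the number of passes of that loop; the function names are illustrative only.

    import random

    def addition_iterations(a, b):
        # One pass XORs the addends into a carry-free partial sum and shifts
        # the generated carries left; the loop stops once no carries remain.
        steps = 0
        while b != 0:
            carry = (a & b) << 1
            a ^= b
            b = carry
            steps += 1
        return steps

    def simulate(n_bits, trials):
        # Empirical distribution of the iteration count for two independent,
        # uniformly random n_bits-bit addends.
        counts = {}
        for _ in range(trials):
            a = random.getrandbits(n_bits)
            b = random.getrandbits(n_bits)
            k = addition_iterations(a, b)
            counts[k] = counts.get(k, 0) + 1
        return dict(sorted(counts.items()))

    if __name__ == "__main__":
        # For random input the counts concentrate around log2(n_bits); the
        # centred distribution only stabilizes along subsequences, reflecting
        # the logarithmic periodicity discussed above.
        print(simulate(n_bits=1024, trials=10_000))

Running the simulation for several values of n_bits gives a quick empirical picture of how the distribution shifts and oscillates with the bit length.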