This paper discusses, in the context of measurement and the definition of measurement units, a problem well known in computer science: the propagation and accumulation of rounding errors through the intermediate steps of a numerical calculation. Some related issues of notation, namely the notation of integer numbers, are also addressed.
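As a minimal illustration of the phenomenon (not taken from the paper itself), the following Python sketch shows how per-step rounding errors accumulate when a value with no exact binary representation, such as 0.1, is added repeatedly:

```python
# Minimal sketch of rounding-error accumulation in binary floating point.
# 0.1 has no exact binary representation, so every addition rounds,
# and the small per-step errors accumulate across intermediate steps.

total = 0.0
for _ in range(1000):
    total += 0.1          # each step introduces a small rounding error

print(total)              # ~99.9999999999986, not exactly 100.0
print(total == 100.0)     # False
print(abs(total - 100.0)) # accumulated error, on the order of 1e-12
```

A single rounding error is bounded by half a unit in the last place, but over many intermediate steps these bounds compound, which is the accumulation effect the paper examines.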