The Cramér–Rao inequality for the variance of an unbiased estimator is first recalled, and the ‘ideal’ estimation equation that exists when the minimum bound is attained is illustrated by examples. When this estimation equation is not available, the more general inequality due to Kiefer is more relevant. The analogous estimation equation corresponding to attainment of the more general minimum bound is again illustrated by examples; and the general theory, which extends to more than one parameter, is indicated for any location and/or scaling problem.
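
For orientation, the classical inequality referred to can be stated in standard (textbook) notation, which may differ from the paper's own: for an unbiased estimator $T$ of $\theta$ with likelihood $L(\theta)$,

$$
\operatorname{Var}(T) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = E\!\left[\left(\frac{\partial}{\partial\theta}\log L(\theta)\right)^{2}\right],
$$

with equality precisely when the score is a linear function of the estimator,

$$
\frac{\partial}{\partial\theta}\log L(\theta) = I(\theta)\,\bigl(T - \theta\bigr),
$$

which is the ‘ideal’ equation of estimation mentioned above: solving the likelihood equation then yields $T$ directly, and the bound is attained.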