Least-squares monotone regression has received considerable discussion and use. Consider the residual sum of squares Q obtained from the least-squares monotone regression of yi on xi. Treating Q as a function of the yi, we prove that the gradient ∇Q exists and is continuous everywhere, and is given by a simple formula. (We also discuss the gradient of d = Q^{1/2}.) These facts, which can be questioned (Louis Guttman, private communication), are important for the iterative numerical solution of models, such as some kinds of multidimensional scaling, in which monotone regression occurs as a subsidiary element, so that the yi and hence indirectly Q are functions of other variables.
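As a numerical illustration of the result stated above, the sketch below fits a least-squares monotone regression by the pool-adjacent-violators algorithm and checks, by central finite differences, that the gradient of Q with respect to the yi agrees with the formula ∂Q/∂yi = 2(yi − ŷi), where ŷ denotes the fitted monotone values. (The PAVA implementation and the test data are this sketch's own; the abstract itself does not specify the algorithm or the formula, so the gradient expression used here is the standard envelope-theorem consequence of ŷ minimizing the sum of squares over monotone sequences.)

```python
def pava(y):
    """Least-squares nondecreasing fit via pool-adjacent-violators.

    Maintains a stack of blocks (mean, weight); whenever two adjacent
    block means violate monotonicity, the blocks are pooled into their
    weighted mean.
    """
    blocks = []  # each entry: [mean, weight]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    fit = []
    for m, w in blocks:
        fit.extend([m] * w)
    return fit

def Q(y):
    """Residual sum of squares of the monotone regression of y."""
    f = pava(y)
    return sum((a - b) ** 2 for a, b in zip(y, f))

# Generic data (no ties between y and its fit, so we stay away from
# degenerate configurations when differencing numerically).
y = [3.0, 1.0, 4.0, 1.5, 5.0, 9.0, 2.0]
fit = pava(y)
grad = [2.0 * (a - b) for a, b in zip(y, fit)]  # claimed gradient 2(y - yhat)

h = 1e-6
for i in range(len(y)):
    yp = list(y); yp[i] += h
    ym = list(y); ym[i] -= h
    fd = (Q(yp) - Q(ym)) / (2.0 * h)  # central finite difference
    assert abs(fd - grad[i]) < 1e-4, (i, fd, grad[i])
```

The finite-difference check passing at every coordinate is consistent with the abstract's claim that ∇Q exists everywhere; continuity of the gradient is what makes such gradient-based iterations (as in multidimensional scaling) well behaved.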