Published online by Cambridge University Press: 01 January 2025
This paper presents a globally optimal solution for a class of functions composed of a linear regression function and a penalty on the sum of squared regression weights. Global optimality is established from inequalities rather than from partial derivatives of a Lagrangian function. Applications arise in multidimensional scaling of symmetric or rectangular matrices of squared distances, in Procrustes analysis, and in ridge regression analysis. The similarity of existing solutions for these applications is explained by treating them as special cases of the general class of functions addressed.
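To make the class of objectives concrete, ridge regression is the simplest special case: minimize a least-squares loss plus a penalty proportional to the sum of squared regression weights. The sketch below is illustrative only and is not the paper's inequality-based derivation; it uses the standard closed-form ridge estimator, and the function name `ridge` and the toy data are assumptions for the example.

```python
import numpy as np

def ridge(X, y, lam):
    """Minimize ||y - X b||^2 + lam * ||b||^2 over b.

    The closed-form minimizer is b = (X'X + lam*I)^{-1} X'y,
    obtained here by solving the regularized normal equations.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Toy data for illustration (not from the paper).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 2.9])

b_ols = ridge(X, y, 0.0)    # lam = 0 recovers ordinary least squares
b_pen = ridge(X, y, 1.0)    # lam > 0 shrinks the weight vector
```

Increasing the penalty weight `lam` shrinks the norm of the solution, which is the sense in which the penalty constrains the regression weights.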
The author is obliged to Henk Kiers and Willem Heiser for helpful comments.