
Problems with EM Algorithms for ML Factor Analysis

Published online by Cambridge University Press:  01 January 2025

P. M. Bentler (University of California, Los Angeles)
Jeffrey S. Tanaka (University of California, Los Angeles)

Requests for reprints should be sent to P. M. Bentler, Department of Psychology, University of California, Los Angeles, California 90024.

Abstract

Rubin and Thayer recently presented equations to implement maximum likelihood (ML) estimation in factor analysis via the EM algorithm. They present an example to demonstrate the efficacy of the algorithm, and propose that their recovery of multiple local maxima of the ML function “certainly should cast doubt on the general utility of second derivatives of the log likelihood as measures of precision of estimation.” It is shown here, in contrast, that these second derivatives verify that Rubin and Thayer did not find multiple local maxima as claimed. The only known maximum remains the one found by Jöreskog over a decade earlier. The standard errors obtained from the second derivatives and the Fisher information matrix thus remain appropriate where ML assumptions are met. The advantages of the EM algorithm over other algorithms for ML factor analysis remain to be demonstrated.
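The abstract's central point, that a negative-definite Hessian of the log likelihood verifies a local maximum, and that its inverse (the observed information) yields standard errors, can be sketched numerically. The snippet below is an illustrative sketch only: it uses a toy normal-sample likelihood in place of the factor-analysis likelihood, and a generic finite-difference Hessian; it is not Rubin and Thayer's EM algorithm nor the authors' computation.

```python
import numpy as np

def log_lik(theta, x):
    # Log-likelihood of an i.i.d. normal sample; a stand-in for the
    # (more involved) factor-analysis likelihood discussed in the text.
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    return np.sum(-0.5 * np.log(2 * np.pi) - log_sigma
                  - 0.5 * ((x - mu) / sigma) ** 2)

def numerical_hessian(f, theta, h=1e-5):
    # Central-difference approximation to the Hessian of f at theta.
    p = len(theta)
    H = np.zeros((p, p))
    for i in range(p):
        for j in range(p):
            def shift(di, dj):
                s = np.array(theta, dtype=float)
                s[i] += di * h
                s[j] += dj * h
                return f(s)
            H[i, j] = (shift(1, 1) - shift(1, -1)
                       - shift(-1, 1) + shift(-1, -1)) / (4 * h * h)
    return H

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)

# The MLE for this toy model is available in closed form.
mu_hat = x.mean()
sigma_hat = np.sqrt(np.mean((x - mu_hat) ** 2))
theta_hat = np.array([mu_hat, np.log(sigma_hat)])

H = numerical_hessian(lambda t: log_lik(t, x), theta_hat)
eigvals = np.linalg.eigvalsh(H)

# All eigenvalues negative => the Hessian is negative definite,
# verifying an interior local maximum of the log likelihood.
print("Hessian eigenvalues:", eigvals)

# Standard errors from the observed information matrix (-H).
se = np.sqrt(np.diag(np.linalg.inv(-H)))
print("standard errors:", se)
```

The same check, applied to a purported second "maximum" at which the Hessian is not negative definite, would expose it as a saddle point or ridge rather than a genuine local maximum, which is the diagnostic role of second derivatives that the abstract defends.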

Type
Original Paper
Copyright
Copyright © 1983 The Psychometric Society


Footnotes

Supported in part by grants DA00017 and DA01070 from the U.S. Public Health Service.

References

Reference Note

Schoenberg, R. Personal communication.

References

Avriel, M. Nonlinear programming: Analysis and methods. Englewood Cliffs, N.J.: Prentice-Hall, 1976.
Bentler, P. M., & Lee, S. Y. Newton-Raphson approach to exploratory and confirmatory maximum likelihood factor analysis. Journal of the Chinese University of Hong Kong, 1979, 5, 562–573.
Bentler, P. M., & Weeks, D. G. Linear structural equations with latent variables. Psychometrika, 1980, 45, 289–308.
Dempster, A. P., Laird, N. M., & Rubin, D. B. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 1977, 39, 1–22.
IMSL. IMSL Library, Edition 9. Houston: IMSL, 1982.
Jöreskog, K. G. A general approach to confirmatory maximum likelihood factor analysis. Psychometrika, 1969, 34, 183–202.
Jöreskog, K. G., & Sörbom, D. LISREL V. Technical Report, Department of Statistics, University of Uppsala, 1981.
Lee, S. Y., & Bentler, P. M. Some asymptotic properties of constrained generalized least squares estimation in covariance structure models. South African Statistical Journal, 1980, 14, 121–136.
Mäkeläinen, T., Schmidt, K., & Styan, G. P. H. On the existence and uniqueness of the maximum likelihood estimate of a vector-valued parameter in fixed-size samples. The Annals of Statistics, 1981, 9, 758–767.
Rubin, D. B., & Thayer, D. T. EM algorithms for ML factor analysis. Psychometrika, 1982, 47, 69–76.
Shapiro, A., & Browne, M. W. On the investigation of local identifiability: A counterexample. Technical Report, Department of Statistics, University of South Africa, 1982.