
Approximated Penalized Maximum Likelihood for Exploratory Factor Analysis: An Orthogonal Case

Published online by Cambridge University Press:  01 January 2025

Shaobo Jin*
Affiliation:
Uppsala University
Irini Moustaki
Affiliation:
London School of Economics and Political Science
Fan Yang-Wallentin
Affiliation:
Uppsala University
*Correspondence should be made to Shaobo Jin, Department of Statistics, Uppsala University, Uppsala, Sweden. Email: shaobo.jin@statistik.uu.se

Abstract

This paper studies penalized maximum likelihood (PML) estimation of an exploratory factor analysis (EFA) model. An EFA model is typically estimated by maximum likelihood, and the estimated loading matrix is then rotated to obtain a sparse representation. Penalized maximum likelihood instead fits the EFA model and produces a sparse loading matrix simultaneously. To overcome some of the computational drawbacks of PML, an approximation to PML is proposed and applied to an empirical dataset for illustration. A simulation study shows that, under various conditions, the approximation naturally produces a sparse loading matrix and estimates the factor loadings and the covariance matrix more accurately than factor rotations, in the sense of having a lower mean squared error.
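
To make the contrast concrete: in generic terms, PML maximizes a penalized log-likelihood of the form l(Lambda, Psi) - c * sum over (i, j) of P(|lambda_ij|), where P is a sparsity-inducing penalty such as the lasso, rather than rotating an unpenalized solution afterwards. The sketch below is illustrative only and is not the approximation proposed in the paper; it runs the conventional two-step route that PML is meant to replace (maximum likelihood EFA followed by a varimax rotation) and then applies a generic soft-thresholding step to show how a lasso-type penalty drives small loadings to exact zeros. The scikit-learn tooling, the example data, and the threshold tau are assumptions made purely for this sketch.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import FactorAnalysis

# Any multivariate dataset serves as an example here.
X = load_iris().data

# Conventional two-step route: maximum likelihood EFA, then varimax rotation.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(X)
loadings = fa.components_.T  # p x k rotated loading matrix

# Generic illustration of an L1 (lasso) penalty: soft-thresholding sets
# small loadings exactly to zero (threshold chosen arbitrarily).
tau = 0.10
sparse_loadings = np.sign(loadings) * np.maximum(np.abs(loadings) - tau, 0.0)

print(np.round(loadings, 2))
print(np.round(sparse_loadings, 2))

The rotated loadings are only approximately sparse (small but nonzero entries remain), whereas the thresholded matrix contains exact zeros, which is the kind of output a penalized fit produces directly.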

Type
Original Paper
Copyright
Copyright © The Psychometric Society 2018


Footnotes

Electronic supplementary material: The online version of this article (https://doi.org/10.1007/s11336-018-9623-z) contains supplementary material, which is available to authorized users.

Supplementary material

Jin et al. supplementary material 1 (File, 215 KB)
Jin et al. supplementary material 2 (File, 5.5 KB)
Jin et al. supplementary material 3 (File, 44.7 KB)
Jin et al. supplementary material 4 (File, 39.1 KB)