
GINDCLUS: Generalized INDCLUS with External Information

Published online by Cambridge University Press:  01 January 2025

Laura Bocci
Affiliation:
Sapienza University of Rome
Donatella Vicari (Corresponding author)
Affiliation:
Sapienza University of Rome

Correspondence should be made to Donatella Vicari, Department of Statistical Sciences, Sapienza University of Rome, Rome, Italy. Email: donatella.vicari@uniroma1.it

Abstract

A Generalized INDCLUS model, termed GINDCLUS, is presented for clustering three-way two-mode proximity data. To account for the heterogeneity of the data, a partition of the subjects into homogeneous classes and a covering of the objects into groups are determined simultaneously. Furthermore, information external to the three-way data is exploited to better account for such heterogeneity: the weights of both classifications are linearly linked to external variables, allowing for the identification of meaningful classes of subjects and groups of objects. The model is fitted in a least-squares framework, and an efficient Alternating Least-Squares algorithm is provided. An extensive simulation study and an application to benchmark data are also presented.
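To make the underlying model family concrete, the sketch below fits a plain INDCLUS-type decomposition, S_k ≈ P diag(w_k) P′ + c_k 1 with a binary membership matrix P and nonnegative subject weights w_k, by alternating least squares. This is a minimal toy illustration, not the GINDCLUS algorithm of the paper (which additionally partitions subjects into classes and links the weights to external variables); the function name and the crude clipping step for nonnegativity are assumptions for illustration.

```python
import numpy as np

def indclus_als(S, J, n_iter=50, rng=None):
    """Toy ALS fit of an INDCLUS-type model (illustrative only).

    S : (K, n, n) array of similarity matrices, one per subject.
    J : number of (possibly overlapping) object clusters.
    Model per subject k:  S_k ~ P diag(w_k) P' + c_k * ones(n, n),
    with P a binary n-by-J membership matrix and w_k >= 0.
    """
    rng = np.random.default_rng(rng)
    K, n, _ = S.shape
    P = (rng.random((n, J)) > 0.5).astype(float)  # random binary start
    W = np.ones((K, J))
    c = np.zeros(K)

    def loss():
        return sum(np.sum((S[k] - P @ np.diag(W[k]) @ P.T - c[k]) ** 2)
                   for k in range(K))

    for _ in range(n_iter):
        # Weight step: for fixed P, each subject's (w_k, c_k) solves a
        # linear least-squares problem; nonnegativity enforced by clipping
        # (a proper NNLS solver would be used in practice).
        X = np.column_stack(
            [np.outer(P[:, j], P[:, j]).ravel() for j in range(J)]
            + [np.ones(n * n)])
        for k in range(K):
            beta, *_ = np.linalg.lstsq(X, S[k].ravel(), rcond=None)
            W[k] = np.clip(beta[:J], 0.0, None)
            c[k] = beta[J]
        # Membership step: flip each binary entry of P and keep the flip
        # only if it lowers the overall loss (greedy row-wise update).
        for i in range(n):
            for j in range(J):
                before = loss()
                P[i, j] = 1.0 - P[i, j]
                if loss() >= before:
                    P[i, j] = 1.0 - P[i, j]  # revert
    return P, W, c, loss()
```

A usage sketch: build K similarity matrices from a known membership structure, then recover a binary covering and nonnegative weights with `indclus_als(S, J=2)`. GINDCLUS extends this scheme by also clustering the K subjects and regressing the weight matrices on external covariates within the same least-squares loop.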

Type
Original Paper
Copyright
Copyright © 2016 The Psychometric Society


Footnotes

Electronic supplementary material The online version of this article (doi:10.1007/s11336-016-9526-9) contains supplementary material, which is available to authorized users.
