
A General Method of Empirical Q-matrix Validation

Published online by Cambridge University Press:  01 January 2025

Jimmy de la Torre*
Rutgers, The State University of New Jersey

Chia-Yi Chiu
Rutgers, The State University of New Jersey
*Correspondence should be made to Jimmy de la Torre, Department of Educational Psychology, Rutgers, The State University of New Jersey, 10 Seminary Place, New Brunswick, NJ 08901, USA. j.delatorre@rutgers.edu

Abstract

In contrast to unidimensional item response models, which postulate a single underlying proficiency, cognitive diagnosis models (CDMs) posit multiple discrete skills or attributes, allowing CDMs to provide a finer-grained assessment of examinees’ test performance. A common component of CDMs for specifying the attributes required by each item is the Q-matrix. Although construction of the Q-matrix is typically performed by domain experts, it nonetheless remains, to a large extent, a subjective process, and misspecifications in the Q-matrix, if left unchecked, can have important practical implications. To address this concern, this paper proposes a discrimination index that can be used with a wide class of CDMs subsumed by the generalized deterministic input, noisy “and” gate (G-DINA) model to empirically validate the Q-matrix specifications by identifying and replacing misspecified entries. The rationale for using the index as the basis of the proposed validation method is provided in the form of mathematical proofs of several relevant lemmas and a theorem. The feasibility of the proposed method was examined using simulated data generated under various conditions, and the method is illustrated using fraction subtraction data.
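As a rough illustration of the kind of variance-based discrimination index the abstract describes (a sketch only, not the paper’s exact formulation): one can score a candidate q-vector for an item by collapsing the latent attribute classes into the groups that q-vector induces and computing the between-group weighted variance of the item’s success probabilities, then compare each candidate against the full q-vector. All function names, the toy probabilities, and the uniform class weights below are assumptions made for this example.

```python
import numpy as np
from itertools import product

def zeta2(q, class_probs, class_weights, patterns):
    """Between-group weighted variance of an item's success probabilities,
    where groups are the attribute classes collapsed by candidate q-vector q.
    Illustrative sketch of a variance-based discrimination index."""
    groups = {}
    for pat, p, w in zip(patterns, class_probs, class_weights):
        key = tuple(a for a, qk in zip(pat, q) if qk == 1)  # required attributes only
        g = groups.setdefault(key, [0.0, 0.0])
        g[0] += w * p   # weighted sum of success probabilities in the group
        g[1] += w       # total group weight
    p_bar = float(np.dot(class_weights, class_probs))       # overall mean success prob.
    return sum(w * (s / w - p_bar) ** 2 for s, w in groups.values())

# Toy setup: K = 2 attributes, uniform class weights, and an item whose
# success probabilities suggest that mastering attribute 1 alone suffices.
K = 2
patterns = list(product([0, 1], repeat=K))            # (0,0), (0,1), (1,0), (1,1)
weights = np.full(len(patterns), 1 / len(patterns))
probs = np.array([0.2, 0.2, 0.9, 0.9])

full = zeta2((1, 1), probs, weights, patterns)        # index under the full q-vector
for q in [(1, 0), (0, 1), (1, 1)]:
    pvaf = zeta2(q, probs, weights, patterns) / full  # proportion of variance accounted for
    print(q, round(pvaf, 3))
```

In this toy case the candidate (1, 0) accounts for all of the variance that the full q-vector does, while (0, 1) accounts for none, so a rule that picks the simplest q-vector exceeding a high cutoff would recover the intended specification.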

Copyright
Copyright © 2015 The Psychometric Society

