
6 - Psychometrics in Clinical Psychological Research

from Part II - Observational Approaches

Published online by Cambridge University Press: 23 March 2020

Aidan G. C. Wright (University of Pittsburgh)
Michael N. Hallquist (Pennsylvania State University)

Summary

High-quality, informative research in clinical psychology depends on the use of measures that have sound psychometric properties. Reliance upon psychometrically poor measures can produce results that are misleading both quantitatively and conceptually. This chapter articulates the implications that psychometric quality has for clinical research, outlines fundamental psychometric principles, and presents recent trends in psychometric theory and practice. Specifically, this chapter discusses the meaning and importance of measures’ dimensionality, reliability, and validity, and outlines the diverse methods for evaluating those important psychometric properties. In doing so, it highlights the utility of procedures and perspectives such as confirmatory factor analysis, exploratory structural equation modeling, classical test theory, item response theory, and contemporary views on validity. It concludes with a brief comment about the process of creating and refining clinical measures. The chapter’s ultimate goal is to enhance researchers’ ability to produce high-quality and informative clinical research.
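As a hypothetical illustration of the classical test theory ideas the chapter covers, one of the most familiar reliability statistics is Cronbach's coefficient alpha, computed from the item variances and the variance of the summed scale score. The sketch below is not from the chapter itself; it assumes a respondents-by-items score matrix and a made-up function name, and alpha's well-known limitations (discussed in the chapter's treatment of reliability) still apply.

```python
import numpy as np

def coefficient_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = (k / (k - 1)) * (1 - sum(item variances) / var(total scores))
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Two perfectly parallel items (identical scores) yield alpha = 1.0
demo = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(coefficient_alpha(demo), 3))  # prints 1.0
```

In practice, researchers increasingly prefer model-based coefficients such as omega, estimated from a fitted factor model, over alpha, since alpha assumes essentially tau-equivalent items.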

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2020


