
Meta-Analysis and the Myth of Generalizability

Published online by Cambridge University Press:  12 June 2017

Robert P. Tett* (University of Tulsa)
Nathan A. Hundley (University of Tulsa)
Neil D. Christiansen (Central Michigan University)

*Correspondence concerning this article should be addressed to Robert P. Tett, Department of Psychology, University of Tulsa, 800 S. Tucker Ave., Tulsa, OK 74104. E-mail: robert-tett@utulsa.edu

Abstract

Rejecting situational specificity (SS) in meta-analysis requires assuming that residual variance in observed correlations is due to uncorrected artifacts (e.g., calculation errors). To test that assumption, 741 aggregations from 24 meta-analytic articles representing seven industrial and organizational (I-O) psychology domains (e.g., cognitive ability, job interviews) were coded for moderator subgroup specificity. In support of SS, increasing subgroup specificity yields lower mean residual variance per domain, averaging a 73.1% drop. Precision in mean rho (i.e., low SD(rho)) adequate to permit generalizability is typically reached at SS levels high enough to challenge generalizability inferences (hence, the “myth of generalizability”). Further, and somewhat paradoxically, decreasing K with increasing precision undermines certainty in mean r and Var(r) as meta-analytic starting points. In support of the noted concerns, only 4.6% of the 741 aggregations met defensibly rigorous generalizability standards. Four key questions guiding generalizability inferences are identified in advancing meta-analysis as a knowledge source.
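The quantities the abstract turns on (sample-size-weighted mean r, observed variance Var(r), residual variance after subtracting expected sampling error, and SD(rho)) come from the standard Hunter–Schmidt "bare-bones" aggregation. The sketch below is an illustrative Python rendering of that textbook procedure, not the authors' own code; the function name and sample data are hypothetical, and it corrects for sampling error only (no reliability or range-restriction corrections).

```python
import math

def bare_bones_meta(rs, ns):
    """Bare-bones Hunter-Schmidt aggregation (sampling error only).

    rs: observed correlations, one per study
    ns: corresponding sample sizes
    Returns (mean_r, var_res, sd_rho), where var_res is the residual
    variance left after removing expected sampling-error variance.
    """
    total_n = sum(ns)
    k = len(rs)
    # Sample-size-weighted mean correlation
    mean_r = sum(n * r for r, n in zip(rs, ns)) / total_n
    # Weighted observed variance of correlations, Var(r)
    var_r = sum(n * (r - mean_r) ** 2 for r, n in zip(rs, ns)) / total_n
    # Expected sampling-error variance, Var(e): k/N form of the usual
    # approximation (1 - mean_r^2)^2 / (n - 1) averaged across studies
    var_e = (1 - mean_r ** 2) ** 2 * k / total_n
    # Residual variance attributed to moderators (situational specificity)
    var_res = max(var_r - var_e, 0.0)
    return mean_r, var_res, math.sqrt(var_res)
```

By the conventional 75% rule, situational specificity is rejected when Var(e) accounts for at least 75% of Var(r); the article's argument is that this residual keeps shrinking as moderator subgroups become more specific, while k per subgroup falls.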

Type: Focal Article
Copyright: © Society for Industrial and Organizational Psychology 2017


Footnotes

The authors gratefully acknowledge the following individuals for their helpful feedback on earlier drafts: David Fisher, Fred Oswald, Mitch Rothstein, Paul Sackett, Piers Steel, and Frances Wen. No endorsement of this work, in whole or in part, is implied.

References

References marked with an asterisk are the meta-analytic articles coded in the present study.

*Arthur, W. Jr., Day, E. A., McNelly, T. L., & Edens, P. S. (2003). A meta-analysis of the criterion-related validity of assessment center dimensions. Personnel Psychology, 56 (1), 125–154.
*Barrick, M. R., & Mount, M. K. (1991). The big five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1–26.
*Bartram, D. (2005). The Great Eight competencies: A criterion-centric approach to validation. Journal of Applied Psychology, 90, 1185–1203.
*Beus, J. M., Dhanani, L. Y., & McCord, M. A. (2015). A meta-analysis of personality and workplace safety: Addressing unanswered questions. Journal of Applied Psychology, 100 (2), 481–498.
Biemann, T. (2013). What if we were Texas sharpshooters? Predictor reporting bias in regression analysis. Organizational Research Methods, 16 (3), 335–363.
Card, N. A. (2012). Applied meta-analysis for social science research. New York: Guilford.
Carlson, K. D., & Ji, F. X. (2011). Citing and building on meta-analytic findings: A review and recommendations. Organizational Research Methods, 14 (4), 696–717.
*Choi, D., Oh, I., & Colbert, A. E. (2015). Understanding organizational commitment: A meta-analytic examination of the roles of the five-factor model of personality and culture. Journal of Applied Psychology, 100 (5), 1542–1567.
*Christian, M. S., Edwards, B. D., & Bradley, J. C. (2010). Situational judgment tests: Constructs assessed and a meta-analysis of their criterion-related validities. Personnel Psychology, 63 (1), 83–117.
Coburn, K. M., & Vevea, J. L. (2015). Publication bias as a function of study characteristics. Psychological Methods, 20 (3), 310–330.
Cochran, W. G. (1954). The combination of estimates from different experiments. Biometrics, 10, 101–129.
Cortina, J. M. (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78, 98–104.
Dalton, D. R., Aguinis, H., Dalton, C. M., Bosco, F. A., & Pierce, C. A. (2012). Revisiting the file drawer problem in meta-analysis: An assessment of published and nonpublished correlation matrices. Personnel Psychology, 65 (2), 221–249.
Ferguson, C. J., & Brannick, M. T. (2012). Publication bias in psychological science: Prevalence, methods for identifying and controlling, and implications for the use of meta-analyses. Psychological Methods, 17 (1), 120–128.
*Gaugler, B. B., Rosenthal, D. B., Thornton III, G. C., & Bentson, C. (1987). Meta-analysis of assessment center validity. Journal of Applied Psychology, 72, 493–511.
*Gonzalez-Mulé, E., Mount, M. K., & Oh, I. (2014). A meta-analysis of the relationship between general mental ability and nontask performance. Journal of Applied Psychology, 99 (6), 1222–1243.
Hedges, L. V., & Olkin, I. (2005). Statistical methods for meta-analysis (2nd ed.). Orlando, FL: Academic Press.
Hedges, L. V., & Pigott, T. D. (2004). The power of statistical tests for moderators in meta-analysis. Psychological Methods, 9 (4), 426–445.
Hedges, L. V., & Vevea, J. L. (1996). Estimating effect size under publication bias: Small sample properties and robustness of a random effects selection model. Journal of Educational and Behavioral Statistics, 21 (4), 299–332.
Higgins, J. P., Thompson, S. G., & Spiegelhalter, D. J. (2009). A re-evaluation of random-effects meta-analysis. Journal of the Royal Statistical Society: Series A (Statistics in Society), 172, 137–159.
*Hoffman, B. J., Kennedy, C. L., LoPilato, A. C., Monahan, E. L., & Lance, C. E. (2015). A review of the content, criterion-related, and construct-related validity of assessment center exercises. Journal of Applied Psychology, 100 (4), 1143–1168.
Hoffman, C. C., & Thornton III, G. C. (1997). Examining selection utility where competing predictors differ in adverse impact. Personnel Psychology, 50 (2), 455–470.
Hough, L. M., Ones, D. S., & Viswesvaran, C. (1998, April). Personality correlates of managerial performance constructs. Paper presented in R. C. Page (Chair), Personality determinants of managerial potential performance, progression and ascendancy. Symposium conducted at the 13th Annual Conference of the Society for Industrial and Organizational Psychology, Dallas, TX.
*Huffcutt, A. I., Conway, J. M., Roth, P. L., & Klehe, U. (2004). The impact of job complexity and study design on situational and behavior description interview validity. International Journal of Selection and Assessment, 12 (3), 262–273.
Hunter, J. E., & Schmidt, F. L. (2004). Methods of meta-analysis: Correcting for error and bias in research findings. Thousand Oaks, CA: Sage.
James, L. R., Demaree, R. G., & Mulaik, S. A. (1986). A note on validity generalization procedures. Journal of Applied Psychology, 71, 440–450.
James, L. R., & McIntyre, H. H. (2010). Situational specificity and validity generalization. In Farr, J. L. & Tippins, N. T. (eds.), Handbook of employee selection (pp. 909–920). New York: Routledge.
*Judge, T. A., Thoresen, C. J., Bono, J. E., & Patton, G. K. (2001). The job satisfaction–job performance relationship: A qualitative and quantitative review. Psychological Bulletin, 127 (3), 376–407.
*Judge, T. A., Bono, J. E., Ilies, R., & Gerhardt, M. W. (2002). Personality and leadership: A qualitative and quantitative review. Journal of Applied Psychology, 87, 765–780.
*Judge, T. A., & Piccolo, R. F. (2004). Transformational and transactional leadership: A meta-analytic test of their relative validity. Journal of Applied Psychology, 89 (5), 755–768.
Kepes, S., Banks, G. C., McDaniel, M., & Whetzel, D. L. (2012). Publication bias in the organizational sciences. Organizational Research Methods, 15 (4), 624–662.
Kromrey, J. D., & Rendina-Gobioff, G. (2006). On knowing what we do not know: An empirical comparison of methods to detect publication bias in meta-analysis. Educational and Psychological Measurement, 66 (3), 357–373.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage.
*Lord, R. G., de Vader, C. L., & Alliger, G. M. (1986). A meta-analysis of the relation between personality traits and leadership perceptions: An application of validity generalization procedures. Journal of Applied Psychology, 71 (3), 402–410.
*Lowe, K. B., Kroeck, K. G., & Sivasubramaniam, N. (1996). Effectiveness correlates of transformational and transactional leadership: A meta-analytic review of the MLQ literature. The Leadership Quarterly, 7 (3), 385–425.
*Martin, R., Guillaume, Y., Thomas, G., Lee, A., & Epitropaki, O. (2016). Leader–member exchange (LMX) and performance: A meta-analytic review. Personnel Psychology, 69 (1), 67–121.
*McDaniel, M. A., Whetzel, D. L., Schmidt, F. L., & Maurer, S. D. (1994). The validity of employment interviews: A comprehensive review and meta-analysis. Journal of Applied Psychology, 79 (4), 599–616.
McDaniel, M. A., Rothstein, H. R., & Whetzel, D. L. (2006). Publication bias: A case study of four test vendors. Personnel Psychology, 59 (4), 927–953.
*McDaniel, M. A., Hartman, N. S., Whetzel, D. L., & Grubb, W. L. (2007). Situational judgment tests, response instructions, and validity: A meta-analysis. Personnel Psychology, 60 (1), 63–91.
Murphy, K. R. (2000). Impact of assessments of validity generalization and situational specificity on the science and practice of personnel selection. International Journal of Selection and Assessment, 8 (4), 194–206.
Murphy, K. R. (2003). Validity generalization: A critical review. Mahwah, NJ: Erlbaum.
Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York: McGraw-Hill.
Oswald, F. L., & McCloy, R. A. (2003). Meta-analysis and the art of the average. In Murphy, K. (ed.), Validity generalization: A critical review (pp. 311–338). Mahwah, NJ: Erlbaum.
Raju, N., Burke, M., Normand, J., & Langlois, G. (1991). A new meta-analytic approach. Journal of Applied Psychology, 76, 432–446.
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86, 638–641.
*Rothstein, H. R., & McDaniel, M. A. (1992). Differential validity by sex in employment settings. Journal of Business and Psychology, 7 (1), 45–62.
Sackett, P. R. (2003). The status of validity generalization research: Key issues in drawing inferences from cumulative research findings. In Murphy, K. (ed.), Validity generalization: A critical review (pp. 91–114). Mahwah, NJ: Erlbaum.
Sackett, P. R., Harris, M. M., & Orr, J. M. (1986). On seeking moderator variables in the meta-analysis of correlational data: A Monte Carlo investigation of statistical power and resistance to type I error. Journal of Applied Psychology, 71 (2), 302–310.
Sackett, P. R., Tenopyr, M. L., Schmitt, N., Kehoe, J., & Zedeck, S. (1985). Commentary on forty questions about validity generalization and meta-analysis. Personnel Psychology, 38, 697–798.
*Salgado, J. F., Anderson, N., Moscoso, S., Bertua, C., & de Fruyt, F. (2003). International validity generalization of GMA and cognitive abilities: A European community meta-analysis. Personnel Psychology, 56 (3), 573–605.
Schmidt, F. L., Gast-Rosenberg, I., & Hunter, J. E. (1980). Validity generalization results for computer programmers. Journal of Applied Psychology, 65, 643–661.
Schmidt, F. L., & Hunter, J. E. (1977). Development of a general solution to the problem of validity generalization. Journal of Applied Psychology, 62, 529–540.
Schmidt, F. L., & Hunter, J. E. (1978). Moderator research and the law of small numbers. Personnel Psychology, 31, 215–231.
Schmidt, F. L., & Hunter, J. E. (1984). A within setting test of the situational specificity hypothesis in personnel selection. Personnel Psychology, 37, 317–326.
Schmidt, F. L., Hunter, J. E., & Caplan, J. R. (1981). Validity generalization results for two jobs in the petroleum industry. Journal of Applied Psychology, 66, 261–273.
Schmidt, F. L., Hunter, J. E., & Pearlman, K. (1981). Task differences and validity of aptitude tests in selection: A red herring. Journal of Applied Psychology, 66, 166–185.
Schmidt, F. L., Hunter, J. E., Pearlman, K., & Rothstein-Hirsh, H. (1985). Forty questions about validity generalization and meta-analysis. Personnel Psychology, 38, 697–798.
*Schmidt, F. L., Hunter, J. E., Pearlman, K., & Shane, G. S. (1979). Further tests of the Schmidt-Hunter Bayesian validity generalization procedure. Personnel Psychology, 32, 257–281.
Schmidt, F. L., Ocasio, B. P., Hillery, J. M., & Hunter, J. E. (1985). Further within-setting empirical tests of the situational specificity hypothesis in personnel selection. Personnel Psychology, 38, 509–524.
Steel, P. D. G., & Kammeyer-Mueller, J. (2008). Bayesian variance estimation for meta-analysis: Quantifying our uncertainty. Organizational Research Methods, 11 (1), 54–78.
Steel, P., Kammeyer-Mueller, J., & Paterson, T. A. (2015). Improving the meta-analytic assessment of effect size variance with an informed Bayesian prior. Journal of Management, 41 (2), 718–743.
Sterne, J. A. C., & Egger, M. (2005). Regression methods to detect publication and other bias in meta-analysis. In Rothstein, H., Sutton, A. J., & Borenstein, M. (eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 223–240). Chichester, UK: Wiley.
Tett, R. P., & Christiansen, N. D. (2007). Personality tests at the crossroads: A response to Morgeson, Campion, Dipboye, Hollenbeck, Murphy, and Schmitt (2007). Personnel Psychology, 60, 967–993.
Tett, R. P., Jackson, D. N., & Rothstein, M. (1991). Personality measures as predictors of job performance: A meta-analytic review. Personnel Psychology, 44, 703–742.
Tett, R. P., Jackson, D. N., Rothstein, M., & Reddon, J. R. (1999). Meta-analysis of bi-directional relations in personality–job performance research. Human Performance, 12, 1–29.
*Tett, R. P., & Meyer, J. P. (1993). Job satisfaction, organizational commitment, turnover intention, and turnover: Path analyses based on meta-analytic findings. Personnel Psychology, 46, 259–293.
van Assen, M. A. L. M., van Aert, R. C. M., & Wicherts, J. M. (2015). Meta-analysis using effect size distributions of only statistically significant studies. Psychological Methods, 20 (3), 293–309.
Vinchur, A. J., Schippmann, J. S., Switzer, F. S., & Roth, P. L. (1998). A meta-analytic review of job performance for salespeople. Journal of Applied Psychology, 83, 586–597.
Viswesvaran, C., & Ones, D. S. (1995). Theory testing: Combining psychometric meta-analysis and structural equations modeling. Personnel Psychology, 48 (4), 865–885.
Whitener, E. M. (1990). Confusion of confidence intervals and credibility intervals in meta-analysis. Journal of Applied Psychology, 75, 315–321.
*Wiesner, W. H., & Cronshaw, S. F. (1988). A meta-analytic investigation of the impact of interview format and degree of structure on the validity of the employment interview. Journal of Occupational Psychology, 61 (4), 275–290.
*Zimmerman, R. D. (2008). Understanding the impact of personality traits on individuals' turnover decisions: A meta-analytic path model. Personnel Psychology, 61, 309–348.