
27 - Meta-analysis

from Part IV - Understanding What Your Data Are Telling You About Psychological Processes

Published online by Cambridge University Press: 12 December 2024

Harry T. Reis, University of Rochester, New York
Tessa West, New York University
Charles M. Judd, University of Colorado Boulder

Summary

Meta-analysis is the quantitative analysis of the results of a research literature. Typically, meta-analysis is paired with a systematic review that fully documents the search process, inclusion and exclusion criteria, and study characteristics. A key feature of meta-analysis is the calculation of effect sizes – metric-free indices of study outcomes that allow effects to be combined mathematically across studies. The methodological literature on meta-analysis has grown rapidly in recent years, yielding an abundance of resources and sophisticated analytic techniques. These developments have improved the field but can also overwhelm aspiring meta-analysts. This chapter therefore aims to demystify some of that complexity, offering conceptual explanations rather than mathematical formulas. We aim to help readers who have not conducted a meta-analysis before get started, as well as to help those who simply want to be intelligent consumers of published meta-analyses.
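
To make the summary's two core ideas concrete – converting each study's result into a metric-free effect size and then combining those effects across studies – the following minimal Python sketch computes Hedges' g (a standardized mean difference with a small-sample correction) for a few studies and pools them with inverse-variance weights, including a simple DerSimonian–Laird random-effects estimate of between-study heterogeneity. The study values are invented for illustration, and the chapter itself presents these ideas conceptually rather than as code.

# Minimal illustrative sketch (hypothetical data): effect sizes and
# inverse-variance pooling, fixed-effect and random-effects.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its sampling variance."""
    df = n1 + n2 - 2
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled                 # Cohen's d
    j = 1 - 3 / (4 * df - 1)                 # small-sample correction factor
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return j * d, j**2 * var_d

# Hypothetical studies: (mean1, sd1, n1, mean2, sd2, n2)
studies = [
    (5.2, 1.1, 40, 4.6, 1.2, 42),
    (3.9, 0.9, 25, 3.8, 1.0, 25),
    (6.1, 1.4, 80, 5.3, 1.3, 78),
]

effects = [hedges_g(*s) for s in studies]
ys = [g for g, v in effects]   # effect sizes
vs = [v for g, v in effects]   # sampling variances

# Fixed-effect pooling: weight each study by 1 / variance.
w = [1 / v for v in vs]
fe_mean = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
fe_se = math.sqrt(1 / sum(w))

# Heterogeneity: Q statistic and DerSimonian-Laird estimate of tau^2.
q = sum(wi * (yi - fe_mean)**2 for wi, yi in zip(w, ys))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(ys) - 1)) / c)

# Random-effects pooling: add tau^2 to each study's variance before weighting.
w_re = [1 / (v + tau2) for v in vs]
re_mean = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
re_se = math.sqrt(1 / sum(w_re))

print(f"Fixed-effect:   g = {fe_mean:.3f} (SE {fe_se:.3f})")
print(f"Random-effects: g = {re_mean:.3f} (SE {re_se:.3f}), tau^2 = {tau2:.3f}")

Running the sketch shows why the choice between fixed- and random-effects weighting matters: when tau^2 is greater than zero, the random-effects estimate gives relatively more weight to smaller studies and its standard error widens accordingly.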

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2024

