
The replication crisis, the rise of new research practices and what it means for experimental economics

Published online by Cambridge University Press:  01 January 2025

Lionel Page*
Affiliation:
University of Technology Sydney, Sydney, NSW, Australia
Charles N. Noussair
Affiliation:
Department of Economics, University of Arizona, Tucson, AZ, USA
Robert Slonim
Affiliation:
University of Technology Sydney, Sydney, NSW, Australia

Abstract

The replication crisis across several disciplines raises challenges for the behavioural sciences in general. In this report, we review the lessons of these developments for experimental economists. We present the new research methods and practices that have been proposed to improve the replicability of scientific studies. We discuss how these methods and practices can have a positive impact in experimental economics, and the extent to which they should be encouraged.

Type
Original Paper
Copyright
Copyright © 2021 The Author(s), under exclusive licence to Economic Science Association

