This journal supports best practices in research openness and transparency. The policies below outline our expectations for authors to help ensure that the research results we publish are as transparent as possible.
Approach to reproducibility
Following Hofner et al. (2016), Research Synthesis Methods will implement the following quality control steps to promote reproducible research. The standard review process focuses on the methods described in a manuscript and does not systematically evaluate the reproducibility of results. All articles accepted through the regular review process will be inspected by the responsible Editor-in-Chief for the presence of software code, including simulation code. Articles with code will be referred to a reproducible research editor (RRE), who is responsible for the code evaluation. For common statistical software, either the RRE or a voluntary expert will check whether the results can be reproduced. The authors will be contacted if the code fails to execute or if the results deviate substantially. If problems persist, the Editors may engage in a discussion with the RRE and the authors. If there are serious doubts regarding the validity of the results, the Editors may decide to withdraw publication of the code or of the entire article. Where authors use uncommon software, the RRE may not be able to evaluate the code, and this will be noted in the publication.
Hofner B, Schmid M, Edler L. Reproducible research in statistics: A review and guidelines for the Biometrical Journal. Biometrical Journal. 2016;58(2):416-27.
Data availability
All manuscripts submitted to this journal must contain a Data Availability Statement, explaining where and how readers can access the data underpinning the research published in the manuscript.
We require all data underpinning your research to be made available to readers through an appropriate repository. In particular, repositories that provide persistent identifiers and have robust preservation policies will help to ensure the long-term integrity of published research. This policy applies to both quantitative data and qualitative materials.
It is not acceptable for your Data Availability Statement to say that data are “available on request” from the authors. If you cannot make your data publicly available due to ethical or legal concerns, please contact the editorial office to discuss an exemption to this policy.
Programming code policy
Authors are required to provide any previously unreported custom computer code used to generate the results described in the manuscript, including code for simulation studies. If there are issues preventing code sharing, the editors will review each case individually. Small amounts of source code can be included in the supplementary material. Authors should make larger amounts of code available in an open repository under an OSI-approved open-source license. Code in a repository or with a formal DOI may be cited and listed in the References section of the manuscript.
If your code is stored and managed in GitHub, please make use of GitHub’s integration with Zenodo to create an archive of your code at the time of manuscript submission. This ensures that other researchers can access a version of your code as it was at the time you published your research, even if you later make changes in GitHub. You will receive a Zenodo DOI so that your code can be formally cited by others.
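As an illustration, Zenodo's GitHub integration can read archive metadata from a `.zenodo.json` file placed at the root of your repository; once the repository is enabled on Zenodo, creating a GitHub release triggers the archive and DOI. A minimal sketch (all field values are placeholders, not requirements of this journal) might look like:

```json
{
  "title": "Simulation code for: <manuscript title>",
  "description": "Code to reproduce the simulation study reported in the manuscript.",
  "upload_type": "software",
  "license": "MIT",
  "creators": [
    {
      "name": "Surname, Given",
      "affiliation": "Example University",
      "orcid": "0000-0000-0000-0000"
    }
  ],
  "keywords": ["reproducible research", "simulation study"]
}
```

The resulting Zenodo record carries this metadata, so the DOI you cite in your References resolves to a fully described, versioned snapshot of the code.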
Citing data and other materials
We encourage authors to cite any materials and data they have used in their research, alongside literature citations, to recognise the importance of all kinds of research outputs.
Open Practice Badges
We recognise best practices in open research by awarding Open Practice Badges to authors who openly share the data and materials underpinning their research, or who have preregistered their research plans.
Badges are awarded by author declaration. You will be asked during the submission process to confirm whether or not you have met the criteria for each badge.