Promoting reproducibility by emphasizing reporting: PLOS ONE’s approach
As we celebrate PLOS ONE’s tenth anniversary, we continue our commitment to uphold rigorous standards for publications across all scientific disciplines. This is no small goal. Several years ago, the staff editors at PLOS ONE posted an Editorial Highlight stressing the journal’s unwavering emphasis on ensuring a high standard of reporting in our publications. Since that post, we have continued to evolve our processes, refining our high technical and ethical standards in response to needs within the scientific community. In parallel, there has been a rising emphasis on reproducibility and transparency in research, as illustrated by such milestones as the 2014 NIH workshop on Principles and Guidelines for Reporting Preclinical Research and the Open Science Collaboration’s failure to fully replicate many key results in psychology [1].
From its inception, PLOS ONE has been a leader in advancing reproducible research, as exemplified by our encouragement of negative results and replication studies. In 2014, an updated Data Availability Policy was implemented across all PLOS journals to encourage validation, reanalysis, and replication of the findings from published studies, addressed in a recent EveryONE post. However, perhaps less frequently discussed is our continuing encouragement and enforcement of quality reporting, including the use of reporting guidelines.
Reporting guidelines as tools to enhance reproducibility
Typically formatted as checklists, reporting guidelines have now been formalized for many study types and delineate the key information necessary to clearly describe how a study was conducted and how the results were interpreted. They act as a guide for authors when composing their manuscript, a tool for editors and reviewers to facilitate efficient assessment, and an indication of accountability for readers. Research on the efficacy of reporting guidelines suggests that their use can improve publication quality. In articles describing clinical trials, the completeness of reporting was improved when journals endorsed the CONSORT guidelines [2] and when authors used the associated TREND guidelines independent of journal endorsement [3]. Similar analyses of systematic reviews and meta-analyses have reported an association between thorough reporting using the PRISMA checklist and an increased article citation rate [4].
However, the adoption of reporting guidelines across the publishing industry has been inconsistent. In a 2016 feature published in Nature News, 69% of the 1,576 researchers surveyed thought that “journal editors enforcing standards to enhance reproducibility (e.g. through checklists)” was likely or very likely to improve the reproducibility of research. However, only 38% of respondents indicated that they had “encountered efforts from journal publishers designed to enhance or ensure standards for describing research methods.” Even in cases where journals encourage the use of checklists, there may not be a significant improvement in the quality of reporting when reporting guidelines are not strictly required, both for health research [5] and animal research studies [6,7].
The enforcement of reporting guidelines by journals could lead to improvements across the publishing arena, though there are challenges to overcome to ensure that journals have the resources necessary to implement and assess checklists. In addition, some authors may feel that completing checklists and providing other requested information adds to the work of preparing a manuscript for submission. However, as the consistent use of standard checklists becomes more widespread, this modest extra effort will likely pay off by facilitating more efficient assessment by editors and reviewers and thus speeding up the review process.
PLOS ONE’s approach to fostering reproducibility
At PLOS ONE, one of our criteria for publication requires that results be rigorously reported as defined by community standards, which may include the incorporation of study type-specific checklists. Thorough reporting not only facilitates reproducibility but also enables an efficient assessment of whether a study meets our other publication criteria, including whether the study was performed to a high technical standard, whether the conclusions are supported by the data, and whether the research was ethically conducted.
To enforce our reporting requirements, journal staff assess manuscripts for completeness at different points during the editorial process. This includes the initial evaluation of every submission by a staff editor in which we look for reporting quality, as well as adherence to our other editorial policies. As a part of these checks, we determine whether the manuscript reports primary research and whether any of the content requires specific evaluation, such as if misuse of the research could lead to biosecurity concerns or if the authors have declared or undeclared competing interests that could interfere with the objective presentation of the research. If the work involves human subjects or animals, we evaluate whether the research was conducted to a high ethical standard and was approved by an appropriate ethics committee. We also assess the quality of language and the presence of any text overlap with previously published work. Only manuscripts that meet our standards, approximately 89% of submissions, are sent on to members of our Editorial Board to begin the review process.
To help us determine whether a study has been sufficiently reported, PLOS ONE requires that reporting checklists be provided at submission for clinical trials and for systematic reviews/meta-analyses: the CONSORT/TREND and PRISMA checklists, respectively. In addition, we have developed our own reporting requirements for meta-analyses of genetic association studies. For other types of studies, we encourage authors to include associated checklists and may specifically request their inclusion, particularly if we feel reporting improvements are necessary to allow for an adequate review of the manuscript.
The ongoing evolution of PLOS ONE’s reporting requirements
An advantage of the broad scope and large volume of PLOS ONE is that the staff editors are attuned to patterns in the types of studies that can have deficient reporting. In response to what we see reflected in submissions, the staff editors work together with our Editorial Board, Human Research Advisory Group, and Animal Research Advisory Group to form new policies and guidelines. For example, we recently posted reporting requirements for studies in which death is used as an experimental endpoint for regulated animals. In this case, the guidelines were established in response to concerns from the community, including Academic Editors, reviewers, and advisory group members, about the poor quality of reporting in some submissions with this study design. The development of reporting requirements and workflows for specific study types enables PLOS ONE to maintain our high reporting and ethics standards at scale and across the many diverse disciplines within our scope.
The goal of enhancing reproducibility by improving reporting practices is consistent with initiatives across all of the PLOS journals. We do not impose any word limit on Methods sections, and we encourage the inclusion of supporting materials with manuscript submissions. In addition, we recently announced a new partnership with protocols.io to enable direct linking of published work to the detailed laboratory protocols used to obtain the results. We also encourage submissions that present evidence on the efficacy of reporting methods, which may be used to further improve peer review and publishing practices. For more information in this area, we encourage exploration of the publications in the PLOS Collections on Meta-Research: Reporting.
As PLOS ONE embarks on its second decade, we continue our commitment to publish thoroughly reported research to enable reproducibility, and thus scientific progress. We look forward to continuing to serve the scientific community and respond to the needs within specific fields by developing and adopting new reporting standards.
1. Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015 Aug 28;349(6251).
2. Turner L, Shamseer L, Altman DG, Schulz KF, Moher D. Does use of the CONSORT Statement impact the completeness of reporting of randomised controlled trials published in medical journals? A Cochrane review. Systematic Reviews. 2012 Nov 29;1(1):60.
3. Fuller T, Peters J, Pearson M, Anderson R. Impact of the transparent reporting of evaluations with nonrandomized designs reporting guideline: Ten years on. American Journal of Public Health. 2014 Nov;104(11):e110-7.
4. van der Pol CB, McInnes MD, Petrcich W, Tunis AS, Hanna R. Is quality and completeness of reporting of systematic reviews and meta-analyses published in high impact radiology journals associated with citation rates? PLOS ONE. 2015 Mar 16;10(3):e0119892.
5. Stevens A, Shamseer L, Weinstein E, Yazdi F, Turner L, Thielman J, Altman DG, Hirst A, Hoey J, Palepu A, Schulz KF. Relation of completeness of reporting of health research to journals’ endorsement of reporting guidelines: systematic review. BMJ. 2014 Jun 25;348.
6. Carbone L, Austin J. Pain and laboratory animals: Publication practices for better data reproducibility and better animal welfare. PLOS ONE. 2016 May 12;11(5):e0155001.
7. Jilka RL. The road to reproducibility in animal research. Journal of Bone and Mineral Research. 2016 Jul 1;31(7):1317-9.
Featured Image: Marcin Wichary from Flickr under CC-BY 2.0