Last week, Nature and Science published the outcome of a meeting convened by the NIH, Nature, and Science to discuss the lack of reproducibility in the basic science research literature. The meeting, which brought together representatives from publishers (including PLOS), the NIH, and other funders, produced a set of principles, the Proposed Principles and Guidelines for Reporting Preclinical Research, which were endorsed by a large and diverse group of publishers, associations, and societies, ourselves included. The main principles are as follows:
- Rigorous statistical analysis
- Transparency in reporting
- Data and material sharing
- Consideration of refutations
- Establishment of best practice guidelines for image-based data and descriptions of biological data
Everyone who works in research, whether as a scientist, editor, or funder, knows that trust is a critical component of confidence in scientific advances. It would be fair to say that until quite recently the somewhat unstructured narrative style of journal articles, with little or no associated data and few other accepted standard practices, was considered sufficient reporting. However, an increasing number of high-profile articles found to be unreliable, together with more general unease (for example, around the lack of availability of data), has led journals to conclude that it is now imperative to be much more specific about what an article must contain in order for it to be reproducible.
These issues have long been discussed in the medical publishing world and are the rationale behind two long-standing initiatives: the ICMJE’s Recommendations for the Conduct, Reporting, Editing and Publication of Scholarly Work in Medical Journals, and the many reporting guidelines collected together by the EQUATOR initiative.
Hence, these new principles build on the momentum within the medical literature and are to be welcomed as a baseline that all biomedical journals should now be able to adhere to in the basic science literature; they are a concrete move toward increasing the reproducibility of published papers.
However, as with all principles put together by committees made up of groups with their own agendas, they do not go as far as any one party might like. As a publisher that straddles science and medicine, PLOS’s own editorial policies already go some way beyond what is currently being suggested. In two specific areas, data availability and publication of refutations, we hope that other journals move rapidly toward more robust requirements.
Availability of the data underlying a published study is probably the most significant way in which journals can now ensure reproducibility of the published literature. A more robust position on data availability, such as PLOS has recently taken with its data policy, which requires deposition of data in a publicly available repository at the point of publication, now seems achievable.
On refutations, PLOS ONE has been leading the charge to get these papers published for many years. The ICMJE requirements have long noted the need for medical journals to publish refutations, but in practice getting refutations, failed replications, or negative findings published is hard. Although the reasons for this are complex, there is no doubt that in a subscription-based world there were very specific disincentives for journals to do so. In Open Access publishing, by contrast, these financial disincentives disappear, and journals are potentially much freer to consider such papers. It will be interesting to see whether journals do in practice follow this recommendation.
What’s next? It will be critical that journals now move to operationalize the recommendations made in this document within their workflows. This is no small feat. It will require creating tools for reviewers and editors to assess papers in a systematic, consistent fashion. Structured review, for example, would be a step in the right direction, although we know that reviewers are not keen on ticking boxes and justifying decisions in a structured way, so journals need to figure out how to bring reviewers with them. Furthermore, implementing these recommendations at scale is even more difficult for large publishers and journals.
We applaud the work done at this recent meeting. These principles and attributes of sound science reporting will serve as critical cornerstones for long-term change, and they will require deep adoption across the entire industry.