

An interview with Ben Brown, Guest Editor of the PLOS ONE-COS Cognitive Psychology Collection

PLOS ONE, in collaboration with the Center for Open Science, recently launched a Cognitive Psychology Collection. It includes submissions to a Call for Papers in cognitive developmental psychology across the lifespan, with an emphasis on open science—transparent reporting practices such as pre-registration or iterative registration; data, code, and material sharing; and preprint posting. 

Ben Brown was one of three Guest Editors for this project, along with Nivedita Mani and Ramesh Kumar Mishra. Ben is Associate Professor of Psychology at Georgia Gwinnett College in Georgia, USA. His research interests are in developmental psychology: he has worked on autobiographical memory, for instance in populations with autism spectrum disorder, and on children’s susceptibility to suggestion.

Benjamin Brown, Guest Editor for the Cognitive Psychology Collection

Ben also has a long-standing interest in open science and the reproducibility and replicability of psychology research: he is a founding member of PsyArXiv, the preprint repository for the psychological sciences hosted by COS, and is a Senior Editor at Collabra: Psychology, the open-access journal of the Society for the Improvement of Psychological Science (SIPS).

I asked Ben about his editorial experience for this collection and his advocacy for open science more broadly.

Can you tell us about your interest in open science, what drew you to it and how that affects your own research?

My interest in scientific rigor and transparency began during my graduate training. During this time, I struggled to replicate well-known and highly regarded findings and found myself frustrated with the lack of transparent reporting in psychological research. As a result, I was eager for opportunities to contribute to improving psychological science.

When I learned of community efforts to address these same challenges I had faced in my own work, I happily and without reservation got involved. In doing so, I found a strong sense of camaraderie with other psychologists working on the issues that I felt so isolated grappling with in graduate school.


Throughout my involvement in the open science movement, I have been pleasantly surprised to find that helping to enable scholars to conduct science in more open and transparent ways can be just as rewarding as, if not more rewarding than, conducting original research itself.

A rationale for this Cognitive Psychology Call for Papers, with its emphasis on transparent reporting and pre-registration, was to help address difficulties in recruitment and planning that are particularly relevant to that field of research. Can you tell us more about it? How do these concerns affect your editorial work more generally?

Transparent communication about the process of scientific research – recruitment, protocol, data analysis – is central to the credibility of science as a field. Unfortunately, many factors make this challenging across subdisciplines within psychology.

With regard to cognitive development, scholars working in this area are often tasked with understanding how processes and abilities change over time, and doing so requires responsiveness to the practical demands of samples that inherently change over the course of their involvement in a given research project. Further, measuring cognitive processes is challenging, and trial and error is often necessary to develop sound, reliable research protocols even in the best of scenarios. This difficulty is magnified when protocols must be adjusted to the needs of a sample whose abilities are themselves growing and changing. Thus, at the outset of a large longitudinal study, for example, it can be very difficult to anticipate every decision that will need to be made over the course of the project and to rigidly adhere to those decisions.


Nevertheless, transparent and complete reporting remains important. Given the challenges I described, some scholars working in this area have been hesitant to adopt preregistration out of concern that the practice may reduce their ability to be creative, flexible, and responsive to their own needs or the needs of their samples. What excites me about preregistration, however, is that I see it as actually enabling those things, while also improving the interpretability of research findings and the cumulative nature of science. Transparently describing and reporting when decisions regarding research methods and analysis were made (at study outset, during data collection, or after data analysis had begun) enables others to better contextualize and understand study findings. Further, preregistration and subsequent documentation of deviations from the original plan help other scholars working in the area better plan their own research by enabling them to anticipate and proactively address challenges.


I have had some previous experience editing more transparent submissions at outlets like Collabra: Psychology and find it quite refreshing. Open data and code allow for easy verification of results. Preregistrations, including any modifications, help reviewers contextualize results and consider matters such as researchers’ degrees of freedom. I honestly would find it difficult to go back to a more traditional editorial experience.

How do you think some of the papers in this Collection illustrate good open science practices that can improve rigor and reliability in psychological research? For instance, the Collection includes a Registered Report Protocol on improving diagnostic accuracy for Alzheimer’s disease, a hotly debated research topic, and another Registered Report Protocol on a user-friendly mobile application to assess inhibitory control (see an interview with the authors of this protocol on the COS blog). What role do you think a more transparent planning and reporting process can play?

I was delighted to see the open, transparent practices exemplified by the articles in this Collection. I was particularly encouraged to see the Registered Report examining Alzheimer’s disease. As I mentioned previously, I believe that preregistration is among the best things we can do as a field and research area to improve rigor and transparency.


I was also happy to be able to suggest additional ways in which contributing authors might share their science openly. Namely, I personally suggested that we encourage all submitting authors to share their manuscripts as preprints on PsyArXiv. Sharing manuscripts in this way further ensures that findings are transparently disseminated, even if the work is ultimately less appealing to publishing outlets, such as when studies report null findings or when work is considered less novel. These studies are important components of the scientific record and sharing them openly can contribute to a more complete and cumulative science.
