Features

Toward Open Science: Contributing to Research Culture Change


Advancing open science across the complex and decentralized research ecosystem remains challenging for a variety of reasons. Specific open science initiatives, including journal policy changes, preregistration, and Registered Reports, represent existing opportunities for publishers, societies, institutions, funders, and researchers to contribute to more coordinated culture change across research communities.

 

In a November 2021 post1 in The Scholarly Kitchen, Roger Schonfeld suggested that the current model for scientific scholarly communication may be ill-suited to improve and sustain public confidence and trust in science as it moves toward greater openness and transparency. The increasing politicization of science,2–4 along with related challenges in effectively managing the public communication of science, intensifies the need to build and sustain trust in the scientific process. After outlining proposed priorities for the scholarly communications community to contribute to trust-building in science, Schonfeld ultimately points to the need for greater coordination and collaboration across stakeholders in the global knowledge system—publishers, senior research officers, policy makers, institutions, funders, and libraries—to sustain a trusted information environment.

System-level coordination and collaboration is a convergent theme across the open science reform movement.5–7 The recent adoption8 of UNESCO’s Recommendation on Open Science9 by all 193 member states offers a new signal of intentionality and a normative framework for global system coordination. The COVID-19 pandemic spurred coordination and implementation of open initiatives among various stakeholders across the scholarly communications system who aligned to confront a global crisis. Specifically, the statement issued in January 2020 by the Wellcome Trust on “Sharing research data and findings relevant to the novel coronavirus (COVID-19) outbreak”10 was signed by 160 organizations worldwide, including research funders, publishers, infrastructure providers, and research institutions. In April 2020, a group of publishers and related organizations launched the COVID-19 Rapid Review Initiative11 to maximize the efficiency and speed of peer review of COVID-19 research.

In June 2021, the Research on Research Institute12 published a report13 of their study evaluating whether these coordinated initiatives did, in fact, change the scholarly communications system by accelerating open access, preprinting and related peer review of preprints, data sharing, and publication times of COVID-19 research. They found that COVID-related research was made more open and freely accessible and that preprinting increased. However, they also found that little had changed in the sharing of data related to COVID-19 research and, furthermore, that efforts to peer review preprints remained low in proportion to the research output.13 The contributors reinforced the theme of system coordination and collaboration by explicitly recommending that “all stakeholders in the research system should recognize that improving scholarly communication is a joint responsibility that requires collaboration and coordinated action across stakeholders, including the development of policies with accompanying monitoring and accountability mechanisms.”13,p75

Yet system coordination to advance open science, even for the dissemination and verification of the scientific record, let alone for the full research lifecycle, remains challenging to fully enact for a variety of reasons. The research ecosystem is decentralized, with socially constructed community norms, so widespread adoption of behavior change is complicated. Furthermore, institutions, scholarly societies, publishers, and researchers themselves often have limited resources and core objectives they must deliver on to sustain their work, making broader ownership of and engagement in the practice of science either unappealing or seemingly untenable. Institutionalized incentive structures are misaligned with the values of openness and transparency, rewarding researchers for getting published and for sharing novel results, while null or negative results are downplayed or ignored, rather than for getting it right.14,15 And systems generally prefer homeostasis, especially when incentive structures are not designed to promote change.

Therefore, when proponents of open science reform call for system coordination and collaboration, it can be daunting for individual leaders or stakeholder groups—publishers, societies, funders, institutions, and others—to know where to begin, how to contribute, and how to make a difference, especially as one actor in a complex and dynamic system. For that reason, we aim to demonstrate how specific open science initiatives can be part of an effort toward more systemic culture change across research communities and stakeholders.

Theory of Research Culture Change: A Systems-Level Approach

At the Center for Open Science16 (COS), we have developed a theory of research culture change17 in service of open, transparent, and reproducible18 science that employs 5 levels of intervention represented by the pyramid in the Figure. These levels are progressive, reflecting the fact that successful implementation of higher levels depends on successful implementation of lower levels. 

Figure. Center for Open Science theory of change model.

To scale adoption of open behaviors by researchers, COS focuses on 1) providing open infrastructure through the open-source Open Science Framework19 (OSF) that makes it possible to do the behaviors; 2) conducting user-centered product development to make it easy to do the behaviors; 3) supporting grassroots organizing through training and community-building efforts to activate early adopters and make their behavior visible;20 4) offering solutions to journals and publishers, funders, societies, and institutions to nudge their incentives to make it desirable to do the behaviors; and 5) providing and promoting a policy framework for stakeholders to make the behaviors required. Effective policy implementation requires effective infrastructure for practicing the behaviors, and community buy-in to treat the behaviors as good practice rather than administrative burdens. These 5 levels of intervention are highly interdependent, each necessary, and none sufficient on their own.

When behavior change requires culture change, it is essential to consider the structural features of the culture and how they enable and constrain individuals to behave according to their intentions and values. Successful normative, incentive, and policy interventions require effective infrastructure that provides easy transitions from how individuals behave today. Likewise, enacting that behavior change requires sensible incentives and policies that align with the behavioral tools available to individuals. For widespread embrace, the changed behavior must be visible to the community to stimulate the diffusion of innovation.

Open Initiatives That Can Support System Change

It is relatively easy to state that systems need to change in order to reform scientific practice. However, such visions require specific, actionable steps that can be supported and implemented. COS points to such specific actions that individual researchers or policymakers at journals, publishers, societies, and funding organizations can take to begin to make this idealized vision a reality. These steps derive from the goal of ensuring that empirical research evidence can be reproduced (verified through checking the collected data and reported findings) and replicated (verified through conducting the reported methods a second time).18 The practices that we focus on to achieve those goals are outlined in the Transparency and Openness Promotion (TOP) Guidelines21,22 and include transparency of underlying data, research materials, analytical code, and study design; citation of research data used in studies; preregistration of study plans, sometimes with a specific analysis plan; and use of policies or workflows that incentivize replication studies, namely Registered Reports. 

Our philosophy comes from the optimization of 2 needs: 1) to meet stakeholders where they are by not pushing to perfection at the expense of any improvement, and 2) to create clear success criteria for ideal results. This optimization is reflected in the tiers of the TOP Guidelines, in which the first level requires that research outputs disclose whether or not any given open practice occurred (e.g., data sharing, code sharing, or preregistration), the second requires adherence to the standard (e.g., that the data actually be shared), and the third verifies that the practice occurred to a high standard (e.g., through computational reproduction).

Once a publisher, funder, or individual journal adopts a policy covering an open science practice, a suite of tools is available to enable that practice. Below are examples of such tools that we use to promote adoption of data sharing, preregistration, and a publishing format known as Registered Reports.

Materials generated during the course of a project are all too often lost when curation is left as an afterthought at the end of a study. Protocols, datasets, research instruments, and analytical code end up on individual drives that may walk away in the normal course of turnover in a research lab. Using an online project management space that also enables persistent sharing (when the project is ready to be made public) reduces costs for a lab that is focused on getting results into the published literature. The OSF enables project management and is connected to built-in registries, data repositories, and preprint servers. Furthermore, it connects to versioning platforms such as GitHub and large online data storage providers such as Dropbox to enable curation. It also offers long-term preservation in partnership with the Internet Archive.23
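For labs that prefer scripted workflows, deposits to an OSF project can also be automated. The short Python sketch below assumes the community-maintained osfclient package (our choice of tool for illustration, not a requirement of the OSF itself); the access token, project identifier, and file paths are placeholders.

# A minimal sketch of scripted curation to an OSF project, assuming the
# community-maintained osfclient package is installed (pip install osfclient).
# The token, project identifier, and file paths are illustrative placeholders.
from osfclient import OSF

osf = OSF(token="YOUR_OSF_PERSONAL_ACCESS_TOKEN")  # personal access token from OSF account settings
project = osf.project("abc12")                     # hypothetical 5-character OSF project identifier
storage = project.storage("osfstorage")            # the project's default OSF Storage provider

# List files already deposited in the project
for f in storage.files:
    print(f.path)

# Deposit an analysis script so it persists beyond individual drives and lab turnover
with open("analysis/clean_data.py", "rb") as fp:
    storage.create_file("code/clean_data.py", fp)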

Preregistration is a process by which a researcher declares that a study is about to occur and specifies the main research questions and the processes by which the study will be conducted. By submitting such declarations to a public, searchable registry (perhaps after an embargo period), consumers of scientific knowledge can better understand how much research is conducted in a field and can open the proverbial “file drawer” of conducted, but not necessarily published, research.24 When the preregistration also includes a specific analysis plan, it can address poor research practices such as selective reporting and cherry picking, which lead to an unrepresentative dissemination of research findings.25

Preregistration, along with easy-to-use study registries, enables better research practices. Registered Reports26 (RR) is a publishing format that incentivizes this process. When a journal offers RRs, it commits to reviewing proposed studies (i.e., the preregistration) for possible publication.27,28 If a journal reviews and provisionally accepts the proposed study, it commits to publishing the final results regardless of the main outcomes of the study. Preliminary evidence suggests that RRs are working as intended: reducing publication bias,29 increasing the rigor of reported findings,30 and still being cited as often as standard-format papers.

Over 300 journals offer RRs as a publishing option, and several funders also engage with the format by funding research that has been given an in-principle acceptance for publication.31 This eases enforcement of outcome reporting because reporting is tied directly to a publication, and the research community already has a strong incentive to publish. Importantly, this coordination between journals and funders creates a broader system that promotes culture change in the academic research community and is central to our systems-level approach to interdependent forces.

The examples above highlight the simple fact that new expectations in any community can turn into new norms only if they are rewarded, verifiable, and used. Furthermore, the decentralized nature of science requires coordination between researchers, institutions, funders, and publishers of scientific knowledge in order to make meaningful progress toward shared goals. 

Lessons Learned in Coordinating the System

Coordinating system change is difficult, especially when incentives are not aligned with the desired normative behavior. Coupled with variations in community and disciplinary terminology, among other challenges, changing research culture can seem insurmountable at times. Navigating these complexities requires an agile and experimental approach. Pilot studies enable exploration of ideas before implementation, metascience (or science of science) research provides a mechanism to study intended and unintended consequences of change, and open communication and feedback allow systems to adapt early so that eventual policy approaches align with the desired practices. Community engagement is critical to enabling reform movements to gain traction, and coordination and participation across stakeholder groups can create a mechanism for continuous improvement and acceleration of change. A key consideration is to keep users’ workflows and experience constantly in view so that behaviors can be implemented more easily and efficiently rather than being perceived as a bureaucratic hoop to jump through. Finally, training and education are important to sustain and increase adoption of change. Systems prefer homeostasis, and it is easy to default to prior behavior, even when we know it is a behavior we want to change. Simply telling researchers to implement open science practices is insufficient.

Let us consider RRs. As mentioned above, RRs have continued to grow in adoption since they were first implemented. There are community efforts to expand this adoption32 and innovations to combine RRs with funding and regulatory review.33,34 This success did not occur in isolation through a single stakeholder or without adaptation. For example, early evaluation efforts highlighted challenges in the implementation of RRs, such as a lack of protocol transparency.35 These challenges led to opportunities to improve and align the process, specifically by leveraging infrastructure that enables authors to easily deposit their protocols, under embargo if necessary, so that the accepted Stage 1 protocol is openly available to interested readers.36 For this open science reform initiative to continue to grow and advance, stakeholders had to adapt it. Many future possibilities remain for RRs, most of which require continued coordination across stakeholders.

Time to Scale and Sustain the Change

Sustained research culture change will come when, together, we move beyond early adopters of open science practices to many agents across the system coordinating and supporting change. Specific open science initiatives that are now gaining traction and greater support across funders, publishers, societies, and institutions can be part of the more systemic culture change effort across research communities and stakeholders. Even small steps to pilot these initiatives within communities can garner needed insights to minimize friction at the outset and maximize outcomes and scalability over time. Greater coordination of these initiatives across all stakeholders can enable a bigger return on investment and minimize the burdens that are inherent to change.

References and Links

  1. Schonfeld RC. Is scientific communication fit for purpose? Scholarly Kitchen. 2021. https://scholarlykitchen.sspnet.org/2021/11/01/is-scientific-communication-fit-for-purpose/. 
  2. Bolsen T, Druckman JN. Counteracting the politicization of science. J Commun. 2015;65:745–769. https://doi.org/10.1111/jcom.12171.
  3. Chinn S, Sol Hart P, Soroka S. Politicization and polarization in climate change news content, 1985–2017. Sci Commun. 2020;42:112–129. https://doi.org/10.1177/1075547019900290.
  4. Gauchat G. Politicization of science in the public sphere: a study of public trust in the United States, 1974 to 2010. Am Sociol Rev. 2012;77:167–187. https://doi.org/10.1177/0003122412438225.
  5. National Academies of Sciences, Engineering, and Medicine. Open science by design: realizing a vision for 21st century research. Washington, DC: The National Academies Press; 2018. https://doi.org/10.17226/25116.
  6. Organisation for Economic Co-operation and Development. Making open science a reality. OECD Science, Technology and Industry Policy Papers, No. 25. Paris: OECD Publishing; 2015. https://doi.org/10.1787/5jrs2f963zs1-en.
  7. Robson SG, Baum MA, Beaudry JL, et al. Promoting open science: a holistic approach to changing behaviour. Collabra: Psychol. 2021;7:30137.  https://doi.org/10.1525/collabra.30137.
  8. UNESCO. UNESCO sets ambitious international standards for open science. 2021. https://www.unesco.org/en/articles/unesco-sets-ambitious-international-standards-open-science?hub=686. 
  9. UNESCO. Draft recommendation on open science. 2021. https://unesdoc.unesco.org/ark:/48223/pf0000378841. 
  10. https://wellcome.org/what-we-do/our-work/sharing-research-data-improve-public-health-full-joint-statement-funders-health
  11. https://oaspa.org/covid-19-rapid-review-collaboration-initiative/
  12. https://researchonresearch.org/
  13. Waltman L, Pinfield S, Rzayeva N, et al. Scholarly communication in times of crisis: the response of the scholarly communication system to the COVID-19 pandemic. Research on Research Institute Report. 2021. https://doi.org/10.6084/m9.figshare.17125394.v1.
  14. Giner-Sorolla R. Science or art? How aesthetic standards grease the way through the publication bottleneck but undermine science. Perspect Psychol Sci. 2012;7:562–571. https://doi.org/10.1177/1745691612457576.
  15. Nosek BA, Spies JR, Motyl M. Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspect Psychol Sci. 2012;7:615–631. https://doi.org/10.1177/1745691612459058.
  16. https://www.cos.io/
  17. https://www.cos.io/blog/strategy-for-culture-change
  18. National Academies of Sciences, Engineering, and Medicine. Reproducibility and replicability in science. Washington, DC: The National Academies Press; 2019. https://doi.org/10.17226/25303.  
  19. https://osf.io/
  20. https://www.cos.io/initiatives/badges
  21. https://www.cos.io/initiatives/top-guidelines
  22. Nosek BA, Alter G, Banks GC, et al. Promoting an open research culture. Science. 2015;348:1422–1425. https://doi.org/10.1126/science.aab2374.  
  23. https://archive.org/
  24. Franco A, Malhotra N, Simonovits G. Publication bias in the social sciences: unlocking the file drawer. Science. 2014;345:1502–1505. https://doi.org/10.1126/science.1255484.
  25. Nosek BA, Ebersole CR, DeHaven AC, et al. The preregistration revolution. Proc Natl Acad Sci. 2018;115:2600–2606. https://doi.org/10.1073/pnas.1708274114.
  26. https://www.cos.io/initiatives/registered-reports
  27. Nosek BA, Lakens D. Registered reports: a method to increase the credibility of published results. Soc Psychol. 2014;45:137–141. https://doi.org/10.1027/1864-9335/a000192.
  28. Chambers CD, Feredoes E, Muthukumaraswamy SD, et al. Instead of “playing the game” it is time to change the rules: registered reports at AIMS Neuroscience and beyond. AIMS Neurosci. 2014;1:4–17. https://doi.org/10.3934/Neuroscience.2014.1.4
  29. Scheel AM, Schijen MRMJ, Lakens D. An excess of positive results: comparing the standard psychology literature with registered reports. Adv Method Pract Psychol Sci. 2021. https://doi.org/10.1177/25152459211007467.
  30. Soderberg CK, Errington TM, Schiavone SR, et al. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat Hum Behav. 2021;5:990–997. https://doi.org/10.1038/s41562-021-01142-4.
  31. https://cos.io/top-funders
  32. https://freeourknowledge.org/2021-07-01-registered-reports-now_ecol-evol-biol/
  33. Chambers CD, Tzavella L. The past, present and future of registered reports. Nat Hum Behav. 2022;6:29–42. https://doi.org/10.1038/s41562-021-01193-7.
  34. Naudet F, Siebert M, Boussageon R, et al. An open science pathway for drug marketing authorization—registered drug approval. PLoS Med. 2021;18:e1003726. https://doi.org/10.1371/journal.pmed.1003726.
  35. Hardwicke TE, Ioannidis JPA. Mapping the universe of registered reports. Nat Hum Behav. 2018;2:793–796. https://doi.org/10.1038/s41562-018-0444-y.
  36. https://osf.io/rr/

 

Lisa Cuevas Shaw, Timothy M. Errington, and David Thomas Mellor, Center for Open Science, Charlottesville, VA

Opinions expressed are those of the authors and do not necessarily reflect the opinions or policies of the Council of Science Editors or the Editorial Board of Science Editor.