
How Journals and Publishers Can Help to Reform Research Assessment


Journals and publishers recognize that editorial decisions can make or break researchers’ careers. It is well established that administrators and decision-makers use journal prestige and impact factors as shortcuts to assess the research of job applicants and current academic staff, and even to proactively recruit academics who score highly on such metrics. It is not uncommon to find language in university evaluation policies that references or explicitly mentions the Journal Impact Factor (JIF). For example, a recent study found that the JIF or closely related terms, including “high-impact journal” and “journal impact,” were mentioned in 23% of review, promotion, and tenure documents in a representative sample of academic institutions across the United States and Canada.1 That figure rose to 40% among research-intensive universities. However, such an approach to research evaluation provides a limited view of anyone’s accomplishments. Many groups have also argued that focusing on journal brands intensifies competition between researchers and journals in ways that distort behavior and undermine a healthy and productive scholarly enterprise.2,3

But it is not enough to recognize the problem. Identifying specific approaches that publishers can take to address these concerns is key. The Declaration on Research Assessment (DORA)4 is doing that by advancing practical and robust approaches to improve how research is evaluated in hiring, promotion, and funding decisions. But change—which is essentially cultural—does not come easily. It hinges on the actions of individuals, organizations, and every stakeholder in the research ecosystem. When DORA was released in 2013, the declaration provided 18 targeted recommendations to publishers, research institutes, funders, metrics providers, and researchers. Five of the recommendations were written for publishers, and the purpose of this article is to highlight some practical steps that publishers can take in support of more effective research assessment.

From Journal Metrics to Article Merits

A central idea in DORA is to shift emphasis from journal-based assessment to a much broader view of scholarly contributions that takes into account individual articles and other research outputs, as well as contributions in teaching, mentorship, and public engagement. As a first step, and to signal a lack of support for the journal impact factor, some publishers have abandoned promotion of the journal impact factor altogether, as has been done by the American Society for Microbiology, eLife, and PLOS.5–7 Other publishers, such as EMBO, Nature Research, and the Royal Society,8–10 have instead placed the journal impact factor in the context of a broad range of journal metrics, which helps to show that different metrics capture different aspects of a journal’s performance. These and other publishers have also published graphs of their citation distributions; the breadth of these distributions, which is common to all journals, demonstrates that an impact factor is a poor predictor of the number of citations any individual paper is likely to receive.11
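To see why a single journal-level average is so uninformative, consider the following minimal sketch. It uses made-up, randomly generated citation counts (not data for any real journal) to illustrate how, in a skewed distribution, most papers receive fewer citations than the mean on which the impact factor is based.

```python
# A minimal sketch, assuming a hypothetical list of per-article citation counts
# for one journal over a two-year window. The numbers are synthetic and purely
# illustrative; only the skewed shape of the distribution matters here.
import numpy as np

rng = np.random.default_rng(0)
# Citation counts are highly skewed; a log-normal draw is a rough stand-in
# for a real journal's distribution.
citations = np.rint(rng.lognormal(mean=1.0, sigma=1.2, size=500)).astype(int)

impact_factor_like = citations.mean()          # the JIF is essentially this mean
median = np.median(citations)
share_below_mean = (citations < impact_factor_like).mean()

print(f"mean (JIF-like): {impact_factor_like:.1f}")
print(f"median:          {median:.0f}")
print(f"share of papers cited less than the mean: {share_below_mean:.0%}")
```

In a distribution like this, well over half of the articles fall below the journal-level mean, which is exactly why the citation-distribution graphs mentioned above are a more honest presentation than a single impact factor.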

To support the shift towards the evaluation of individual articles (and other outputs), services have been developed that provide article-level metrics and indicators. Altmetric and ImpactStory12,13 gather metrics from a variety of sources, including Twitter, Facebook, Wikipedia, news outlets, and blogs, to provide a sense of the attention an individual article receives beyond citations. Importantly, these and other tools allow qualitative as well as quantitative information to be gathered, such as who is commenting on an article and what types of opinions are being expressed. Publishers can support these approaches by providing article-level data themselves, including information about usage and citations.
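For publishers or researchers who want to experiment with such data, Altmetric offers a public API for basic DOI lookups. The short sketch below is illustrative only: the endpoint and response field names are based on my reading of the public documentation and should be checked against it, and the DOI used is simply that of the eLife article cited in reference 6.

```python
# A minimal sketch of retrieving article-level attention data from the public
# Altmetric API. Endpoint and field names are assumptions to verify against
# the Altmetric documentation; no API key is needed for basic DOI lookups.
import requests

def altmetric_summary(doi: str) -> dict:
    """Return the Altmetric record for a DOI, or an empty dict if none exists."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:      # no attention data tracked for this DOI
        return {}
    resp.raise_for_status()
    return resp.json()

record = altmetric_summary("10.7554/eLife.00855")
# Print only the per-source counts (e.g. cited_by_tweeters_count,
# cited_by_wikipedia_count), which give a rough sense of where attention comes from.
for key, value in sorted(record.items()):
    if key.startswith("cited_by_") and key.endswith("_count"):
        print(f"{key}: {value}")
```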

Another initiative that can be supported by publishers is CRediT,14 which provides a standardized taxonomy of author contributions. Many major publishers have adopted this taxonomy, which helps to identify the specific contributions that any author has made to a study. With greater adoption by journals, authors can compile their contributions across studies. Coupled with the use of article-level metrics and indicators, it is therefore possible for a researcher to build a data-driven picture of the influence of their work, which extends beyond traditional “authorship.” However, as with the use of any metrics, care must be taken in the presentation and interpretation of such data.15 The Metrics Toolkit16 can help individuals better understand what information different metrics can and cannot provide.
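As a concrete illustration, the sketch below shows one way a production system might validate declared contributions against the 14 CRediT role names. The data structure, function, and author names are hypothetical and not part of any published schema; only the role names themselves come from the CRediT taxonomy.

```python
# A minimal sketch, assuming a hypothetical journal workflow that records CRediT
# roles per author. The 14 role names follow the CRediT taxonomy
# (https://www.casrai.org/credit.html); everything else is illustrative.
CREDIT_ROLES = {
    "Conceptualization", "Data curation", "Formal analysis",
    "Funding acquisition", "Investigation", "Methodology",
    "Project administration", "Resources", "Software", "Supervision",
    "Validation", "Visualization", "Writing - original draft",
    "Writing - review & editing",
}

def validate_contributions(contributions):
    """Check that every declared role is a recognized CRediT term."""
    for author, roles in contributions.items():
        unknown = set(roles) - CREDIT_ROLES
        if unknown:
            raise ValueError(f"{author}: unrecognized role(s) {unknown}")

# Hypothetical contributor statement for a single article.
article_contributions = {
    "A. Author": ["Conceptualization", "Methodology", "Writing - original draft"],
    "B. Author": ["Software", "Formal analysis", "Writing - review & editing"],
}
validate_contributions(article_contributions)
```

Recording roles in a structured, machine-readable form like this is what allows contributions to be compiled across studies rather than remaining buried in free-text author notes.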

Beyond Articles

Increasingly, publishers are supporting the recognition of research outputs beyond peer-reviewed articles. One important step being taken is to encourage best practices in the citation of outputs such as data, code, protocols, and other resources. The Joint Declaration of Data Citation Principles, for example, has an associated set of recommendations17 that all journals can follow. A related initiative has been created to generate unique identifiers for research resources (RRIDs).18 By encouraging the use of such identifiers and practices, metrics can be gathered about the usage and value of all research outputs, which can feed into a more holistic approach to the assessment of an individual’s, group’s, or university’s research.

On the other side of the coin, citing research outputs is not useful unless those outputs are available to others. Journals should therefore require authors to make all of the core data and resources that underpin a piece of published work available as openly as possible, according to the FAIR (findability, accessibility, interoperability, and reusability) principles,19 so that other interested researchers can build on the work. Authors will benefit from this approach because their resources and findings are more likely to be used and cited by others: information that could bolster applications for jobs and funding.20

Finally, another under-recognized aspect of scholarly activity is peer review. The insight and advice that researchers routinely provide to their colleagues receive little, if any, recognition. Another valuable step that publishers can take, therefore, is to ensure that reviewers get credit for their reviews and, if reviewer and author agree, to publish the peer review reports (with or without the name of the reviewer). There is a growing list of journals that are either already publishing reports or are committed to doing so.21 To take this a stage further, publishers can integrate with services such as ORCID or Publons22,23 to add peer review activity to a researcher’s profile and help them gain recognition for this scholarly contribution. Researchers can use this information as evidence of their service during evaluations.

Mighty Metadata

Richer and more effective research assessment will be supported by a robust network of connections between people and all of their research outputs and contributions. A crucial component of such a network is high-quality and open metadata. Publishers are the providers of a huge amount of metadata, made available through a number of services, especially Crossref. Several initiatives have been introduced in recent years to increase the value of publishing metadata and to strengthen the network of scholarship, most notably the Metadata 2020 project.24 Publishers have been at the forefront of many of these developments and are continuing to play an important role in their adoption. Nevertheless, there is still a lot of variability in the quality of metadata, and improvements can be made.

Many publishers now require an ORCID iD for one or more authors of each submission, which will help with the creation of more complete and useful ORCID profiles.25 Another important development is the Initiative for Open Citations (I4OC),26 which was launched in 2017 to encourage publishers to make their reference list metadata open. Most publishers deposit this metadata with Crossref, but access is restricted by default; to make the data open, publishers simply need to send an email to Crossref. Since I4OC was launched, more than half of this reference metadata has been made openly available. However, many publishers are still unnecessarily restricting access, which limits its value for new uses and services.27 Reference data can be used for many purposes, but given its relevance to research evaluation, fully open data will also help to support further experimentation and greater transparency in evaluation practices.
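The effect of opening references is easy to check programmatically. The sketch below queries the public Crossref REST API for a single DOI and counts the openly available references in the response; the field names reflect my understanding of the Crossref API and should be verified against its documentation, and the DOI (one of the articles cited here) is just an example.

```python
# A minimal sketch using the public Crossref REST API (https://api.crossref.org).
# The "reference" array is assumed to appear in the response only when the
# depositing publisher has made its references openly available.
import requests

def open_reference_count(doi: str) -> int:
    """Return the number of openly available references deposited for a DOI."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    resp.raise_for_status()
    work = resp.json()["message"]
    return len(work.get("reference", []))

doi = "10.7554/eLife.21718"   # example DOI, taken from reference 29 of this article
n = open_reference_count(doi)
if n:
    print(f"{doi}: {n} open references")
else:
    print(f"{doi}: references not deposited or not open")
```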

Advocacy

Whatever actions are taken by publishers and journals to encourage the reform of research evaluation, it is also valuable to provide context for these initiatives. Editorials, blog posts, and other articles can all be used to explain the position that a particular journal is taking. Publishers can also help to advocate for reform among the other stakeholders, especially researchers, funders, and institutions.

Scholarly meetings, especially society meetings, are another place to bring people together for conversations about innovation in research assessment, and journals associated with societies are in a great position to do this. DORA itself originated from a group of journal editors and publishers who met at the American Society for Cell Biology (ASCB) meeting in San Francisco in 2012. More recently, DORA hosted a capacity-building session at the 2018 ASCB|EMBO Meeting, where participants provided feedback on application materials for grant funding and faculty positions. During the exercise, participants identified shortcuts that assessors might be tempted to take when reviewing applications. To help uncouple individual articles from a publisher’s brand, one idea was to remove journal names from bibliographies and ask applicants to provide a 2–4 sentence summary describing the significance of the work.28

Looking Inward

In addition to taking action to encourage more effective and fairer research assessment by other organizations, publishers should also examine their own processes. Participation in the scientific publishing process as editors, reviewers, and authors contributes to researchers’ professional success. Journals therefore have an obligation to promote equity, diversity, and inclusion at each step of the process. Some gender imbalances are easy to recognize, such as the relatively low numbers of women among editors and peer reviewers.29 Others, however, are less apparent. For example, one study revealed gender inequalities among co-first authors on research articles, suggesting that female authors do not always receive the credit they deserve.30 One way that journals can decrease such disparities is by ensuring that editorial boards and peer reviewers reflect the diversity of the scientific community, which might also help to reduce bias in the editorial process.31

Why Take Action?

The fundamental purpose of journals and publishers is to support the communication and conduct of scholarship. As things stand, there is concern that the way journals are used for research evaluation is harming scholarship by introducing perverse incentives.32 Counteracting these effects will require coordinated action by all of the key stakeholders involved in scholarly communication, and journals and their publishers must play their part. In this article, we have described some actions that are achievable by most journals; they are summarized in a call to action (Box 1). Journals that adopt these and other approaches will be at the forefront of much-needed reform and will serve scholarship more effectively.


Box 1 – Call to Action

  1. Cease the promotion of journal impact factors (ref 5)
  2. Provide article-level metrics and indicators (ref 33)
  3. Adopt the CRediT taxonomy for author contributions (ref 14)
  4. Ensure that all reference data deposited with Crossref is open (ref 26)
  5. Require authors to make all key data available according to FAIR principles (ref 19)
  6. Follow the data citation principles (ref 17)
  7. Encourage the use of unique identifiers (e.g., RRIDs; ref 18)
  8. Require authors to use ORCID iDs (ref 25)
  9. Publish peer review reports and author responses along with the article (ref 21)
  10. Examine ways to increase diversity, equity, and inclusion in the publishing process (ref 31)

Acknowledgements

We thank Dominique Babini and Ginny Barbour for reviewing the article and providing helpful feedback.

References and Links

  1. McKiernan EC, Schimanski LA, Nieves CM, Matthias L, Niles MT, Alperin JP. Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. PeerJ Preprints 2019;7:e27638v2. https://doi.org/10.7287/peerj.preprints.27638v2.
  2. Montgomery J, Nurse P, Thomas DJ, Tildesley D, Tooke J. The findings of a series of engagement activities exploring the culture of scientific research. London, UK: Nuffield Council on Bioethics. 2014. Available at: http://nuffieldbioethics.org/project/research-culture/the-findings.
  3. Working Group on Rewards under Open Science. Evaluation of research careers fully acknowledging open science practices. Brussels, Belgium: European Commission. 2017. Available at: https://ec.europa.eu/research/openscience/pdf/os_rewards_wgreport_final.pdf#view=fit&pagemode=none.
  4. https://sfdora.org/
  5. Casadevall A, Bertuzzi S, Buchmeier MJ, et al. ASM journals eliminate impact factor information from journal websites. mSphere 2016;1:e00184-16. https://doi.org/10.1128/mSphere.00184-16.
  6. Schekman R, Patterson M. Science policy: reforming research assessment. eLife 2013;2:e00855. https://doi.org/10.7554/eLife.00855.
  7. https://www.plos.org/dora
  8. http://emboj.embopress.org/about#bibliometrics
  9. https://www.nature.com/npg_/company_info/journal_metrics.html
  10. https://royalsocietypublishing.org/rstb/citation-metrics
  11. Larivière V, Kiermer V, MacCallum CJ, et al. A simple proposal for the publication of journal citation distributions. bioRxiv 2016. https://doi.org/10.1101/062109.
  12. https://www.altmetric.com/
  13. http://impactstory.org/
  14. https://www.casrai.org/credit.html
  15. Wilsdon J, Allen L, Belfiore E, et al. The metric tide: report of the independent review of the role of metrics in research assessment and management. Stoke Gifford, UK: Higher Education Funding Council for England. 2015. https://doi.org/10.13140/RG.2.1.4929.1363.
  16. http://www.metrics-toolkit.org/
  17. Cousijn H, Kenall A, Ganley E, et al. A data citation roadmap for scientific publishers. Sci Data 2018;5:180259. https://doi.org/10.1038/sdata.2018.259.
  18. https://scicrunch.org/resources
  19. https://www.go-fair.org/fair-principles/
  20. Piwowar HA, Vision TJ. Data reuse and the open data citation. PeerJ 2013;1:e175. https://doi.org/10.7717/peerj.175.
  21. https://asapbio.org/letter
  22. https://members.orcid.org/api/workflow/peer-review
  23. https://publons.com/about/home/
  24. http://www.metadata2020.org/
  25. https://orcid.org/content/requiring-orcid-publication-workflows-open-letter
  26. https://i4oc.org/
  27. Taraborelli D. The citation graph is one of humankind’s most important intellectual achievements. Boing Boing 14 April 2018. https://boingboing.net/2018/04/14/open-graphs.html.
  28. Hatch A, Kiermer V, Pulverer B, Shugart E, Curry S. Research assessment: reducing bias in the evaluation of researchers. eLife 2019. https://elifesciences.org/inside-elife/1fd1018c/research-assessment-reducing-bias-in-the-evaluation-of-researchers.
  29. Helmer M, Schottdorf M, Neef A, Battaglia D. Research: gender bias in scholarly peer review. eLife 2017;6:e21718. https://doi.org/10.7554/eLife.21718.
  30. Broderick N, Casadevall A. Meta-research: gender inequalities among authors who contributed equally. eLife 2019;8:e36399. https://doi.org/10.7554/eLife.36399.
  31. Murray D, Siler K, Larivière V, et al. Gender and international diversity improves equity in peer review. bioRxiv 2019. https://doi.org/10.1101/400515.
  32. Curry S. Let’s move beyond the rhetoric: it’s time to change how we judge research. Nature 2018;554:147. https://doi.org/10.1038/d41586-018-0642-w.
  33. https://www.plos.org/article-level-metrics

Anna Hatch is the DORA Community Manager. Mark Patterson is the Executive Director of eLife and serves on the Steering Group of DORA and the Board of Directors of Crossref.