Combatting research integrity breaches: insights from Suzanne Farley, Director of the Springer Nature Research Integrity Group
The COVID-19 pandemic has led to a large volume of research moving through the publication pipeline, often at an accelerated rate. Given the widely publicised concerns regarding flawed COVID-19 research, the spotlight is now, more than ever, on ensuring best practice and ethical conduct in research and publication. Following her talk at the 2020 European Medical Writers Association (EMWA) symposium, Suzanne Farley, Director of the Springer Nature Research Integrity Group (SNRIG), spoke to The Publication Plan about her experience of research integrity breaches and her group's work to combat them.
For anyone unfamiliar with the SNRIG, what role does it play in detecting and investigating research integrity problems among submissions or publications at Springer Nature?
“SNRIG is an advisory body serving all Springer Nature content divisions: books, journals, magazines and databases. We advise staff and their external editorial partners on best practice for resolving research integrity and publication ethics problems, in both published and pre-publication content. We also advise policy and operations colleagues on improving quality control during pre-publication manuscript processing workflows.”
What are some of the most common problems identified?
“Plagiarism, authorship disputes, conflicts of interest, image manipulation, data fabrication, substandard peer-review, identity theft, and paper mill submissions.”
How often does the SNRIG investigate potential issues pre- versus post-publication? In general, should more be done by publishers to establish the quality of submissions during peer review, or at other pre-publication stages?
“Based on 2020 data, 65% of the cases investigated by SNRIG concerned published content, with the remainder detected before publication. These numbers show that publishers should indeed be doing more before publication to detect problematic submissions.
Springer Nature has initiated a series of projects this year to start shifting this needle. We’ll be piloting new checks – some facilitated by people, some by software – with the aim of detecting the issues that most frequently underlie research integrity and publication ethics cases more efficiently. Think image manipulation, paper mills, peer review manipulation, unethical research and irreproducibility. Wherever possible, we’ll use artificial intelligence (AI) to ensure the checks are fast, consistent and scalable, being always mindful of striking the right balance between what a machine can do and where expert human oversight is required.”
Have you noticed any changes in the type or volume of SNRIG’s workload during the COVID-19 pandemic?
“There was an overall increase in submissions to journals in 2020; submissions directly related to COVID-19 accounted for some of the spike. So, SNRIG’s workload, like that of everyone else involved in the publication workflow, increased proportionally. We noticed that authors of COVID-19-related papers were more frequently submitting their manuscripts to multiple journals at the same time, and they were more likely to miscategorise their original research articles as ‘Letters to the Editor’ or the like, perhaps in an effort to circumvent thorough peer review.
It’s impossible to know what motivates individual authors to make these choices, or to distinguish unequivocally between honest error and deliberate misconduct. That said, getting their manuscripts published as fast as possible is a safe guess as to motivation. But circumventing peer review and overloading peer reviewers and editors, by submitting the same manuscript to multiple journals at the same time, are obviously not OK. Springer Nature increased the stringency of existing checks early during the pandemic to crack down on this behaviour.”
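The interview does not describe how Springer Nature's duplicate-submission checks work internally. As a loose illustration only, the core idea of flagging the same manuscript submitted to multiple journals can be sketched with a simple title-similarity comparison (the function names and the 0.9 threshold here are hypothetical, not Springer Nature's actual criteria):

```python
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    """Lowercase and collapse whitespace so trivial formatting
    differences don't mask a match."""
    return " ".join(title.lower().split())

def likely_duplicate(title_a: str, title_b: str, threshold: float = 0.9) -> bool:
    """Flag two submission titles as a possible simultaneous submission
    when their similarity ratio meets the (illustrative) threshold."""
    ratio = SequenceMatcher(None, normalize(title_a), normalize(title_b)).ratio()
    return ratio >= threshold

# Example: the same manuscript submitted to two journals with minor retitling
a = "Rapid antigen testing for SARS-CoV-2: a cohort study"
b = "Rapid Antigen Testing for SARS-CoV-2:  A Cohort Study"
print(likely_duplicate(a, b))  # True
```

Real publisher systems would of course compare far more than titles (author lists, abstracts, files), but the principle of automated cross-checking at submission time is the same.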
It’s often stated that science ‘self-corrects’ and that retraction of flawed research is one of the ways in which this occurs. According to Retraction Watch, there was a 40-fold increase in the annual number of retractions between 2000 and 2020. Why do you think there’s been such a huge increase in the past 20 years, and how do you think that the open science movement will impact retraction rates in the future?
“Science does indeed self-correct, but it does so too slowly. There are no reliable baseline data, but my opinion is that we’re mostly playing catch-up. That is, there has not been a 40-fold increase in the frequency of research integrity issues in the past 20 years. Rather, there has been a combination of a somewhat increased frequency (as technology that facilitates image manipulation has evolved, and third parties like paper mills have emerged, for example) plus a steep increase in awareness in academic communities about research integrity. People are now much more likely to report issues to publishers. Indeed, there’s a growing community of incredibly committed whistleblowers who dedicate themselves to trawling the historical literature for errors. All of this adds up to a steep increase in the number of retractions. This increase will persist as we continue to clean up existing problematic publications, while working on preventing more problematic content from being published in the first place.
Open research and publication practices should have a positive impact. If practices like pre-printing and making raw data and peer review history publicly available become the norm, it will be more difficult to conceal deliberate misconduct and easier to detect honest errors. Open research practices are definitely the way forward.”
According to a 2012 analysis of retracted biomedical and life science research articles, there’s been a 10-fold increase in retractions due to scientific fraud since 1975. You’ve previously noted that the SNRIG adopts a non-accusatory approach, adhering to the principle of innocent until proven guilty, with the aim of correcting the scholarly record rather than administering punishment. Do you think that this approach allows you to tackle and resolve more issues, more quickly than historically slow retraction processes? Given the rise in scientific fraud, do you feel that the SNRIG’s approach does enough to discourage serial offenders?
“Yes, SNRIG’s approach allows us to more thoroughly investigate and therefore resolve more issues. But it is frustratingly slow. I often use the analogy of SNRIG being like a police force but without the power to exercise a search warrant. We rely on cordial relationships with authors, peer reviewers, editors and all other stakeholders to facilitate our investigations. Without their cooperation, we often cannot access essential investigation materials (eg raw data, out-of-system correspondence), so cannot reach unequivocal (and legally defensible) conclusions and take definitive actions.
In the past 3 years, SNRIG has taken a much more ‘Springer Nature-wide’ approach to case work. For example, if we suspect that a problem on a single paper may be part of a wider network of misconduct (eg a paper mill), we will check all Springer Nature journals (~3,000) and books for the author names, peer reviewer names, email addresses, and manuscript title. This cross-portfolio functionality required us to build a new database; previously, it was not possible to search across journals and books in this way. We also inform institutions and funders much more frequently about suspected misconduct. This approach deters serial offenders from submitting to Springer Nature.
What we need to do next is take a cross-publisher approach. There’s no point deterring serial offenders from submitting to Springer Nature if they can simply submit to another publisher. It’s great to see organisations like the Committee on Publication Ethics (COPE) and the STM Association starting to facilitate these cross-publisher interactions.”
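The cross-portfolio database described above checks author names, peer reviewer names, email addresses, and manuscript titles across all Springer Nature journals and books. The matching logic is not disclosed, but as a minimal hypothetical sketch (all record names and helpers below are invented for illustration), the core lookup amounts to intersecting a new submission's identifiers with those of records already under suspicion:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Submission:
    """Minimal record: a title plus sets of author names and email addresses."""
    title: str
    authors: frozenset
    emails: frozenset

def shared_identifiers(new: Submission, known: Submission) -> frozenset:
    """Return any author names or email addresses the two records share."""
    return (new.authors & known.authors) | (new.emails & known.emails)

def flag_submission(new: Submission, watchlist: list) -> list:
    """Flag a new submission against every record on a (hypothetical)
    watchlist of suspected paper mill papers; return the matches."""
    hits = []
    for record in watchlist:
        overlap = shared_identifiers(new, record)
        if overlap:
            hits.append((record.title, overlap))
    return hits

# Hypothetical example: a new submission reuses a known contact address
mill_paper = Submission("Suspected mill paper A",
                        frozenset({"J. Doe"}), frozenset({"j.doe@example.com"}))
new_paper = Submission("New submission B",
                       frozenset({"A. Smith"}), frozenset({"j.doe@example.com"}))
print(flag_submission(new_paper, [mill_paper]))
```

At production scale this would be backed by indexed database queries rather than a linear scan, but the flag-on-shared-identifier principle is the same.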
When retraction occurs, vague retraction notices can hinder readers’ understanding of exactly what has happened. How do you think publishers should approach Expressions of Concern or retraction notices?
“There has been an improvement in the amount and consistency of information provided in post-publication notices, but the improvements have not been uniform across the sector. Even publishers that have improved still do not provide a huge amount of information in retraction notices or Editorial Expressions of Concern. One factor preventing further progress is the ever-increasing amount of litigation involved in research integrity investigations. It is unfortunately very common for authors and other stakeholders to threaten legal action, which means only information that is 100% verifiable can be made public in post-publication notices.”
Readers may be shocked to know the extent of some research integrity cases, for example those involving paper mills. In your experience, how serious is the problem and how can publishers cooperate to tackle it?
“Paper mills are one of the few things that keep me awake at night. They are pervasive and devious, continuously adapting their processes to circumvent detection. It’s a real cat-and-mouse game. Their papers pollute the literature, waste huge amounts of editor and peer reviewer time, and negatively impact the work of honest researchers.
At Springer Nature we have been working with data scientists to map networks of authors, peer reviewers, email addresses and manuscript titles associated with paper mills. The maps will be used to flag potential paper mill submissions as they hit our systems. We’re also training our staff and editors to manually spot the features of paper mill submissions. And we’re sharing everything we learn with other publishers, via COPE and the STM Association. I hope we’ll soon get to a point where publishers can share identifying information about suspected paper mill submissions, but there are some major legal barriers, including those related to the EU General Data Protection Regulation (GDPR), that need to be overcome first.”
Those involved in research misconduct are likely to implement increasingly sophisticated methods. In the future, how do you think advancing technology will help detect potential problems with articles?
“There are some really impressive tools being developed by an ever-increasing number of extremely talented computer scientists (the recent inaugural Computational Research Integrity conference showcased a selection). Many of these tools are at an early stage of development, but I’m hopeful that we’ll be able to integrate the front-runners into publishers’ quality check workflows within the next few years. Then it’s a matter of staying one step ahead of those intent on research misconduct, ensuring our tools are better than theirs.”
Given the considerable resources required to investigate research integrity issues, and the low priority this is given by some publishers, do you think a regulatory body for publishers would help to improve the quality of published research?
“I do. There is a lot of effort being expended by reputable publishers to set and enforce common standards. Most are growing their research integrity groups and investing in improved quality control. But, in the absence of a regulator, the pace of improvement lags behind what’s needed to clean up the literature, prevent the spread of misinformation and safeguard the public’s trust in science. And less reputable, predatory publishers are flourishing in the absence of regulation. If there are no sanctions against bad actors, it will be difficult for the good guys to come out on top.”
Finally, you have mentioned that collaboration between stakeholders is needed to improve research integrity. What can all stakeholders involved in scientific research do to help combat research integrity issues?
“There’s no way that question can be answered succinctly, I’m afraid! There are so many different stakeholders, in so many different locations, with competing priorities and a distinct hierarchy of problems to fix within a unique framework of local conditions. That said, the first step to addressing any of it is to talk about it. Let’s talk openly about the problems we see, how widespread they are, and how ill-equipped we currently are to prevent them.
Only when everyone involved in the research enterprise understands the extent of the problems will we get sufficient buy-in to address them in a meaningful way.”
Suzanne Farley is Director of the Springer Nature Research Integrity Group.
With thanks to our sponsor, Aspire Scientific Ltd