Retraction data can be used to clean up science

Nature: https://www.nature.com/articles/d41586-025-00509-1

Nature’s news team analyses the rise in institutional retraction rates: Jining First People’s Hospital could be the world’s most-retracted institution

This surge can now be seen in a first-of-its-kind analysis of institutional retraction rates around the globe over the past decade, for which Nature’s news team used figures supplied by three private research-integrity and analytics firms. Jining First People’s Hospital tops the charts: more than 5% of its total output from 2014 to 2024 has been retracted, amounting to more than 100 papers (see ‘Highest retraction rates’). That proportion is far higher than both China’s national average and the global average, and, depending on how one counts, the hospital could be the world’s most-retracted institution.
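The rate described above is a simple ratio of retracted papers to total output. A minimal sketch of the calculation, using hypothetical totals (only the more-than-100-retractions and above-5% figures come from the article):

```python
def retraction_rate(retracted: int, total: int) -> float:
    """Return the share of an institution's output that has been retracted, in percent."""
    if total <= 0:
        raise ValueError("total output must be positive")
    return 100.0 * retracted / total

# Hypothetical: 105 retracted papers out of 2,000 published over 2014-24.
rate = retraction_rate(105, 2000)  # 5.25, above the 5% threshold the article notes
```

The ratio is trivial, but as the article goes on to show, both the numerator (which retractions are counted) and the denominator (which papers are credited to an institution) vary between data providers.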

Knowing retraction rates could prompt institutions to change their incentives: institutions might start to weigh retraction rates rather than counting only articles and citations, and funders could take such metrics into account.

A different way to view the data is to pick out the institutions with the largest overall numbers of retractions. By this measure, the top three are universities in Saudi Arabia, but most of the leading institutions are in China. Institutions with large numbers of papers tend to have lower retraction rates. Totals also depend on counting choices: at Jilin University in China, for instance, some analysts count retractions from its affiliated hospitals together with those from the rest of the university.

Some observers theorize that authors from Ethiopia and other African countries, who publish relatively few articles, might have taken advantage of waived open-access fees for scholars from low- or lower-middle-income countries. According to a spokesperson, the publisher targeted by the scheme has strengthened its internal checks.

Many universities and research institutions around the world prize high productivity. They encourage researchers to publish more articles and accrue more citations; higher counts can signal that an institution’s research is impactful and can push up its international ranking. Most published articles and citations represent reliable contributions to the scientific record, but in some cases the push for more, faster research comes at a price: it can encourage sloppy science, plagiarism and fabrication.

What retraction data reveal about countries: India, Saudi Arabia, the United States, the United Kingdom and others

But India’s rate is surpassed by retraction rates in Ethiopia and Saudi Arabia, and exceeded or rivalled (depending on the data set) by those in Iraq and Pakistan. Russia’s rate is probably undercounted in the Retraction Watch data because many Russian retractions do not have DOIs, and the firms omitted those papers from their analyses. Meanwhile, countries such as the United States and the United Kingdom have rates of around 0.04%, much lower than the global average of 0.1%, and many countries have even lower rates (see ‘Retraction rates by country’).

Analysing institutions is an even thornier task, because the underlying databases contain both data errors and different curation approaches, Nature’s analysis found. To map institutional affiliations, Dimensions uses a private Global Research Identifier Database (GRID), whereas OpenAlex uses the public Research Organization Registry (ROR). For some smaller institutes in particular, the database curators might have made different decisions about how to assign affiliations, so the analyses vary from firm to firm.
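The effect of divergent curation can be sketched with toy data: the same papers are credited differently depending on whether a registry treats an affiliated hospital as a separate institution or merges it into its parent university. All names and mappings below are hypothetical, illustrating the kind of choice GRID- and ROR-style registries make, not their actual records:

```python
from collections import Counter

# Hypothetical papers with raw affiliation strings.
papers = [
    {"id": "p1", "affiliation": "First Hospital of Example University"},
    {"id": "p2", "affiliation": "Example University"},
]

# Two registry-style mappings: one keeps the hospital as a separate
# institution, the other merges it into the parent university.
registry_a = {
    "First Hospital of Example University": "Example University Hospital",
    "Example University": "Example University",
}
registry_b = {
    "First Hospital of Example University": "Example University",
    "Example University": "Example University",
}

def institutional_counts(mapping):
    """Tally papers per institution under a given affiliation mapping."""
    return Counter(mapping[p["affiliation"]] for p in papers)

# The same two papers yield different institutional totals under each registry:
# registry_a credits one paper each to the hospital and the university,
# while registry_b credits both papers to the university.
```

Scaled up to an institution with dozens of affiliated hospitals, this single curation choice can noticeably change both its retraction count and its denominator of total papers.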

Researchers in public universities and government institutes in India face fewer pressures to publish than do those in private universities and colleges, says Agrawal. Private institutions, he says, push students and researchers to publish many articles, and in some cases pay bonuses for papers published.

The three firms’ counts of retracted articles over the past decade are some 6–15% greater than those in the Retraction Watch data set they build on. But articles can be recorded erroneously in Crossref and other online sources, so the data might not be 100% accurate, cautions Jodi Schneider, an information-sciences researcher at the University of Illinois Urbana-Champaign who studies retractions.

The firms’ tools focus on potential red flags in research articles and submitted manuscripts. To build them, the firms created internal data sets of retracted papers and their affiliations. The information in these is based on the database launched by Retraction Watch, which Crossref acquired in 2023 and distributes publicly, making it easier for others to use and analyse.

One firm, Scitility, says it will make its institution-level figures public later this year. “We think scientific publishing will be helped if we create transparency around this,” says the firm’s co-founder Jan-Erik de Boer, a former chief information officer at Springer Nature who is based in Roosendaal, the Netherlands.

Ivan Oransky, a co-founder of the website Retraction Watch, says it may be tempting to use the data to examine the incentives facing researchers at different institutions.

Part of the problem is that some hospital physicians purchased fake manuscripts from paper mills, businesses that churn out fraudulent scientific reports. These doctors were under pressure because they had to publish papers to get jobs or earn promotions, says research-integrity sleuth Elisabeth Bik, who is based in California. Sleuths such as Bik began spotting signs of the problem, identifying duplicated images across large numbers of papers; they publicized the issue, and a wave of retractions followed.

Last year, under the auspices of the US National Information Standards Organization, publishers did agree on a common technical standard for recording and communicating retractions, which might help to improve matters.

Retraction records themselves are messy. One heavily cited paper in The Lancet is categorized as “retracted” in Crossref, a database that records metadata about published articles, even though it received a correction rather than a retraction. The journal’s publisher did not reply to a request for comment.

Still, the new mountain of retraction data is a valuable tool that should not be ignored. Quality, as well as quantity, is important in science.
