Scientific fraud: The case of the Spanish university rector should prompt changes to ranking systems

The founders of Retraction Watch, an organization that tracks research misconduct, call for eliminating the incentives to manipulate metrics

Professor Juan Manuel Corchado, photographed on April 12, before being elected the rector of the University of Salamanca. J. M. García (EFE)
Adam Marcus and Ivan Oransky

From afar, the inauguration of Juan Manuel Corchado as the rector of the University of Salamanca earlier this year probably seemed a natural and well-deserved career capstone for the prominent academic. After all, Corchado, a highly prolific computer scientist, is among the most-cited researchers in Spain — a mark of the high regard in which his peers hold his work.

But as EL PAÍS has been reporting for months, Corchado’s impressive reputation as a scholar may be undeserved. Many of his citations are to his own work, and flimsy work at that: brief conference presentations Corchado uploaded to his website and then referenced, as we were first to note in 2022. The case has now attracted the attention of the Spanish Research Ethics Committee, which “has urged the University of Salamanca to exercise ‘its powers of inspection and sanction’ in the face of ‘the alleged bad practices’ by Corchado.”

Why would such bad practices help Corchado and the university? Because the various ranking systems’ rubrics — factors that help determine funding from government agencies as well as competition for student enrollment — lean heavily on citations, which are particularly easy to manipulate. In other words, the better individual scientists look on paper, the better their institutions appear.

The Corchado case is just a high-profile example of what the obsession with metrics has wrought. In Vietnam, researchers are abuzz about a recently released ranking system, but media reports find the scheme “chaotic” and full of errors. Last week, The Economist published a fawning piece about science in China. “China has become a scientific superpower,” the magazine declared, and “tops the Nature Index, created by the publisher of the same name, which counts the contributions to articles that appear in a set of prestigious journals.”

What The Economist left out — though it has noted it before — is that China is responsible for well over half of the world’s more than 50,000 retractions, a dubious distinction that can be traced directly to the country’s laser focus on metrics. Until the practice was officially banned in 2020, researchers in China were paid rich cash bonuses for publishing in journals that count toward the Nature Index, and clinical faculty at medical schools — whose jobs do not involve research, and who are not trained for such work — were required to publish papers to hold their jobs and be promoted.

Those incentives were essentially direct invitations to commit fraud, as a recent survey of researchers in China demonstrated. How else were academics supposed to maintain their careers except by increasing their output, creating citation rings, or even turning to a thriving paper mill industry?

While it is easy to blame the Chinese government for the citation arms race, universities have done nothing to push back, and in many cases have even encouraged the system to work exactly the way it does. In India, for example, a dental college came up with what one critic called a “nasty scheme” involving self-citation to boost itself to the top of the rankings in its field. In Saudi Arabia, some universities apparently hired prominent mathematicians as honorary faculty just so their citations would count toward their institutions’ rankings.

Which brings us back to Corchado. Why he cited himself so heavily is not clear, because he never responded to our requests for comment two years ago except to say he’d broken his arm and would be slow to reply. But at the time, Alberto Martín-Martín, an information scientist and bibliometrician at the University of Granada, noted that Spain still relies heavily on the Journal Impact Factor to evaluate its researchers’ output, even more so than some other countries.

In a way, the public should thank Corchado: his case set off alarm bells at EL PAÍS and the Spanish Research Ethics Committee. Whether he remains rector of the University of Salamanca is less important than whether the episode prompts real change in Spain — and elsewhere. Movements are afoot, including the Declaration on Research Assessment (DORA) and the Leiden Manifesto, to encourage a move away from citations and other metrics and toward strategies that reward the kind of research culture we want and need.

Universities and governments have a chance to reform their evaluation strategies before things get even worse. They can replace metrics with the good old-fashioned way of assessing researchers’ work: reading it.

Adam Marcus and Ivan Oransky are the founders of Retraction Watch, an American organization that tracks scientific fraud.

