COPE Case #18-03, “Editors and reviewers requiring authors to cite their own work” reads like a political thriller:
Working alone late one night, a staffer stumbles upon a decision letter in which a handling editor instructs an author to cite several of his papers. Intrigued, the staffer digs deeper and finds a pattern of systematic abuse: a gang of crony reviewers willing to do the handling editor's bidding, and evidence of strong-arming authors who put up any resistance. The staffer brings the ream of evidence to the Editor-in-Chief, who goes to the editorial board. Confronted and asked to explain himself, the handling editor resigns in haughty indignation. Case closed. Or is it?
All COPE cases are public; however, the texts are carefully edited to preserve anonymity. COPE is an industry advisory group, not a court of law, and the purpose of publicizing cases is to educate, not adjudicate. We can only hope that the summary of actions provides a clear path for future staffers and editors dealing with similar cases of misconduct. Still, it makes me wonder just how common editorial misconduct is, and whether the vast majority of cases, like similar abuses of power, go unreported.
A 2012 survey of social sciences authors, published in the journal Science, reported that one in five respondents said they had been coerced by journal editors into adding citations to papers published in the editor's journal. Not surprisingly, lower-ranked faculty were more likely to acquiesce to this type of coercion. A follow-up study in PLOS ONE confirmed that the practice of requesting additional citations to the journal was prevalent across disciplines, although much more frequent in marketing, information systems, finance, and management than in math, physics, political science, and chemistry. In these studies, the researchers limited coercive citation to the journal itself, assuming that its purpose was to inflate the journal's Impact Factor. But what if its purpose was also to inflate citations to the editor himself, or to a cartel of other participating journals?
Last year, the journal Land Degradation & Development came under public scrutiny when its editor, Artemi Cerdà, was accused of coercing authors to cite LDD and his own papers. Within a few short years, the journal’s Impact Factor rose from 2.058 in 2013 (the year Cerdà took charge) to 9.787 in 2016, lifting it from rank #12 among soil science journals to #1. Oddly, Clarivate celebrated LDD as one of the “world’s most influential journals of 2017.” Cerdà was forced to resign in early 2017.
Stratospheric increases in individual citation counts don’t necessarily mean that misconduct has been committed. Some authors are fortunate enough to publish a fundamentally important paper that quickly becomes highly cited. While a single such paper can drive exponential growth in an author’s total citations, it has little (if any) effect on the author’s h-index, a measure that reflects the author’s entire portfolio of published papers. Thus, a better indicator of coercion might be an author whose h-index suddenly shifts from linear to exponential growth.
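To make the distinction concrete, here is a minimal sketch in Python (with invented citation counts, not data from any real author) showing why one blockbuster paper barely moves the h-index even as it dramatically inflates total citations:

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical portfolio: 10 papers, each cited 10 times.
portfolio = [10] * 10
print(h_index(portfolio))  # 10

# One paper becomes a blockbuster with 500 citations: total citations
# jump from 100 to 590, but the h-index is unchanged at 10.
portfolio[0] = 500
print(h_index(portfolio))  # 10
```

Moving the h-index from 10 to 11 would require at least 11 papers with 11+ citations each, which is why sudden exponential h-index growth is harder to explain away as one lucky paper.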
Computationally, looking for rapid rises in citation performance is not difficult: financial markets run many algorithms that scan, in real time, for odd trading patterns that trigger buying or selling decisions. Some of these transactions are investigated by the US Securities and Exchange Commission when made by individuals with cozy relationships with the company. Given that the citation market is so much smaller and less complicated than the financial market, searching for unusual patterns in the citation literature should be much easier. Metrics and analytics are becoming big business in publishing, offering tools for evaluating authors, journals, and institutions. I’m somewhat surprised that no one has stepped into this arena offering tools for forensic purposes.
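As a rough illustration of how simple such a screen could be (and not a description of any vendor's actual method), one could fit both a straight line and an exponential curve to an author's yearly h-index and flag the series when the exponential model fits markedly better. The series and threshold below are invented for illustration:

```python
import math

def least_squares(xs, ys):
    """Ordinary least squares fit y = a + b*x; returns (a, b, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def looks_exponential(yearly_h, threshold=0.5):
    """Flag a series when an exponential (log-linear) fit beats a plain
    linear fit by a wide margin. The 0.5 threshold is arbitrary and
    purely illustrative."""
    years = list(range(len(yearly_h)))
    _, _, sse_lin = least_squares(years, yearly_h)
    a, b, _ = least_squares(years, [math.log(h) for h in yearly_h])
    # Back-transform the exponential fit and score it on the original scale.
    sse_exp = sum((h - math.exp(a + b * t)) ** 2
                  for t, h in zip(years, yearly_h))
    return sse_exp < threshold * sse_lin

steady = [5, 7, 9, 11, 13, 15]        # steady linear growth
suspicious = [5, 7, 10, 15, 24, 40]   # roughly doubling each year
print(looks_exponential(steady))      # False
print(looks_exponential(suspicious))  # True
```

A real forensic tool would need far more care: h-indices are small integers, legitimate careers accelerate, and any flag would only be a prompt for human review, never an accusation.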
I have to admit that I’m a little uneasy about developing such citation pattern detection algorithms, as doing so assumes that there is something unseemly to be found in the data. An unusual growth pattern is simply an unusual growth pattern. Nevertheless, if citation coercion is far more widespread than reported because of a fundamental power imbalance within academic publishing, providing more transparency may help staffers and victims of citation coercion speak out.