Like (I suspect) many readers of the Scholarly Kitchen, I am regularly invited to submit papers to conferences on scholarly topics that have nothing to do with my areas of expertise or interest: materials science, chemical engineering, organizational behavior, climate change — a wide variety of conference programs are apparently hungry for my insights on every topic imaginable, despite the fact that anything I could write about most of them would amount to little more than gibberish.
As it turns out, there’s an explanation for this: gibberish is, at some conferences anyway, a marketable commodity.
A recent article in Nature reports that over the past couple of years, computer scientist Cyril Labbé has uncovered over 100 papers in more than 30 conference proceedings published between 2008 and 2013 that were not written by scientists or scholars, but were instead generated by SCIgen. SCIgen is a computer program created in 2005 by researchers at the Massachusetts Institute of Technology; using context-free grammar, it generates papers consisting entirely of nonsense. Because the nonsense is syntactically coherent and draws on discipline-specific terminology, it can easily be mistaken for scholarship by nonspecialist readers — particularly if those readers are not paying close attention.
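To see how little machinery this kind of text generation requires, here is a toy sketch of the context-free-grammar technique SCIgen uses. The grammar below is my own illustrative invention, not SCIgen's actual (far larger) rule set; the point is only that recursively expanding rules and sprinkling in discipline-specific jargon yields fluent-sounding nonsense.

```python
import random

# A toy context-free grammar in the spirit of SCIgen. These rules and
# phrases are invented for illustration; SCIgen's real grammar is much
# larger. Keys are nonterminals; values list possible expansions, whose
# elements are either nonterminals or terminal strings.
GRAMMAR = {
    "SENTENCE": [["INTRO", "TOPIC", "VERB", "TOPIC", "."]],
    "INTRO": [["In recent years,"], ["Many researchers agree that"]],
    "TOPIC": [["the lookaside buffer"], ["erasure coding"],
              ["the producer-consumer problem"]],
    "VERB": [["can be made compatible with"], ["is impossible without"]],
}

def expand(symbol, rng):
    """Recursively expand a symbol until only terminal strings remain."""
    if symbol not in GRAMMAR:  # terminal string: emit as-is
        return symbol
    production = rng.choice(GRAMMAR[symbol])
    return " ".join(expand(s, rng) for s in production)

rng = random.Random(0)
print(expand("SENTENCE", rng).replace(" .", "."))
```

Every sentence the sketch emits is grammatical and jargon-laden, yet semantically empty — which is exactly why such output can slip past an inattentive reader or a careless review process.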
“Paying close attention,” you might think, is what peer review and editorial oversight are all about. Peer review doesn’t necessarily work the same way for conference proceedings as it does for scholarly journals, but the two publishers responsible for the proceedings in which Labbé found fake papers (Springer and the Institute of Electrical and Electronics Engineers, or IEEE) are nevertheless left with lots of egg on their faces. It’s not as if a double-blind review process would have been needed to catch these fakes — as Labbé points out, “the papers are quite easy to spot,” and any competent proceedings editor with a modicum of expertise in the field should have been able to detect them.
More troubling than the discovery of these published deceptions is the fact that it’s not the first time IEEE has been responsible for this kind of thing. In 2012 Labbé notified IEEE of 85 fake papers in their proceedings publications, and it quickly withdrew them — but in 2013 he identified a new batch of bogus papers from the same publisher. This suggests not only that IEEE’s review and oversight processes were well below par in 2012, but that Labbé’s exposure and IEEE’s acknowledgment of the issue did little to strengthen those processes subsequently.
An additional (and even more disturbing) problem with the proceedings papers most recently discovered is emerging as the investigation continues: at least one of the authors contacted had no idea that he had been named as a coauthor. This suggests that the submissions were more than spoofs — spoofing can easily be accomplished by using fake names as well as fake content. The use of real scientists’ names suggests that at least some of these papers represent deliberate scholarly fraud, likely intended to pad scholars’ résumés.
This development harks back to a couple of other notable sting operations intended to expose shoddy or fraudulent practices in particular disciplines or areas of the publishing community. It comes only a few months after science journalist John Bohannon published the results of his investigation into the editorial practices of 304 Open Access (OA) journals. He submitted a nonsense paper to all of them, and it was accepted for publication by just over half. The reaction to his investigation was overwhelmingly negative and defensive — and came not so much from the publishers themselves, but rather from the OA advocacy community, which tended to see his study as an attack on OA generally. For a discussion of the study and the reactions to it, see Phil Davis’s interview with Bohannon published in the Scholarly Kitchen last fall. (As an interesting side note, Davis and Kent Anderson collaborated on a similar sting operation in 2009 after receiving multiple spam invitations from Bentham Publishing; incidentally, they used SCIgen to create the nonsense paper that they offered as bait. Bentham accepted the paper after claiming to have subjected it to peer review.)
Labbé’s exposé also brings to mind a similar and more famous hoax perpetrated in 1996 by physicist Alan Sokal, who wanted to find out whether a leading journal in the field of cultural studies would “publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors’ ideological preconceptions.” As it turned out, the journal Social Text was indeed willing to publish Sokal’s piece, despite the fact that it contained such absurdist assertions as “Lacan’s psychoanalytic speculations have been confirmed by recent work in quantum field theory” and “the axiom of equality in mathematical set theory… reflects set theory’s ‘nineteenth-century liberal origins’” (and despite the fact that it explicitly questioned the reality of physical existence). Sokal’s hoax also generated defensive responses and lots of debate, but (in my view) not nearly enough soul-searching on the part of the academic community.
Springer’s response to Labbé’s exposé was just what it should have been: not defensiveness, accusations of bad faith, or attempts to shame and silence him, but embarrassment, corrections, and promises to do better. IEEE’s response (sent to me by Monika Stickel, Director of Corporate Communications, but not available online) is more vague and less apologetic, allowing that “there might have been some conference papers… that did not meet our quality standards” but also saying that IEEE “took immediate action to remove those papers, and also refined our processes to prevent papers not meeting our standards from being published in the future.” According to Nature, those articles are now gone from the IEEE Xplore database — but without any notification or explanation to readers as to what happened to them.