Front side (obverse) of one of the Nobel Prize medals in Physiology or Medicine awarded in 1950 to researchers at the Mayo Clinic in Rochester, Minnesota. (Photo credit: Wikipedia)

In a recent article in the Guardian entitled, “How journals like Nature, Cell and Science are damaging science,” with a subtitle reading, “The incentives offered by top journals distort science, just as big bonuses distort banking,” Randy Schekman, one of the editors of eLife, starts out with a plaintive and humble “I am a scientist.” With such a demure start, it might seem surprising that the article immediately devolves into a piece that has inspired incredulous ridicule in emails, on Twitter, and in the comments on the article — not because the initial statement is false, but because the very next statement is laughable given the author and the context:

Mine is a professional world that achieves great things for humanity. But it is disfigured by inappropriate incentives. The prevailing structures of personal reputation and career advancement mean the biggest rewards often follow the flashiest work, not the best. Those of us who follow these incentives are being entirely rational – I have followed them myself – but we do not always best serve our profession’s interests, let alone those of humanity and society.

Schekman shared in this year’s Nobel Prize in Physiology or Medicine, certainly one of the “biggest rewards” in science. (Commendably, he donated his prize money to his university.)

Luckily, when a prize-winner of Schekman’s caliber complains about the incentives in science, people sit up and take notice. Unfortunately for him, most responses were incredulous or derisive. As one Twitter wag commented:

I refuse to be considered for the Nobel prize. Distorts science.

Other comments included:

It isn’t news that a Nobel prize winner can now manage without selective journals (except the one he edits).

If space is unlimited and elitist journals are “distorting science”, then why does eLife reject 75% of submissions?

Am waiting for one of my trainees to nervously ask whether we are boycotting Cell/Science/Nature. YES!*
*after we win a Nobel

Schekman has published 46 papers across Science, Nature, and Cell, and published an article in Science just this year. In a prime example of biting the hand that fed you, Schekman seems to demonstrate that once you win the Nobel Prize, the journals you used to establish your career and publish the findings that led to the Nobel somehow become just something to scrape off the soles of your shoes. (His example is almost as classic as Harold Varmus’, another Nobel Laureate who sought to undermine the system of publication incentives that brought his works to light.) As Schekman writes:

Like many successful researchers, I have published in the big brands, including the papers that won me the Nobel prize for medicine, which I will be honoured to collect tomorrow. But no longer. I have now committed my lab to avoiding luxury journals, and I encourage others to do likewise.

One complaint from Schekman is that these luxury journals artificially cap publication, while open access (OA) journals, which have been, in his terms, “born on the Web,” do not suffer from such unnecessary constraints:

Born on the web, they can accept all papers that meet quality standards, with no artificial caps.

Why his own journal rejects roughly the industry-average percentage of papers may require some explanation, then, given that it too was born on the web. This is just one of many sleights of hand Schekman or his amanuensis engages in.

Schekman is either ignorant or disingenuous in castigating the incentives involved with publication as a problem for science. Surely, some incentives (like direct payments for publishing in high impact journals) can be misaligned, but these misalignments tend to occur because academic or funding bodies introduce distortions. Schekman is unwilling to blame anyone but established publishers for these and other woes, even though publication is clearly aligned with science and the public good. As economist Paula Stephan stated in my interview with her last year, publication — and the priority thereby established — is what allows a scientist to claim a work as her or his own:

Priority “solves” the public good problem, providing a strong incentive for scientists to share their discoveries. The upside is that priority encourages the production and sharing of research. There are other positives — one relates to the fact that it is virtually impossible to reward people in science for effort since it’s virtually impossible to monitor scientists. The priority system solves this, rewarding people for achievement rather than effort.  Priority also discourages shirking — knowing that multiple discoveries of the same finding are somewhat commonplace leads scientists to exert effort.

There are many incentives in science, of course. Among them, publication is a relatively modest one for most scientists in most instances.

Other incentives can have more power. Prizes as an incentive have been used for centuries to spur scientific investigation, and there are currently dozens of large-scale prizes and hundreds of smaller prizes. Among other salutary effects, these prizes can incentivize particular types of research, establish timeframes for contests, or push research forward in particular regions.

Grants are another form of incentive, a prize if you will, and granting bodies routinely set out funding agendas that can affect the direction and pace of research.

Tenure is yet another incentive, as is academic advancement. But these incentives are barely mentioned in Schekman’s article, and those who may be mismanaging them are not castigated.

In addition, Schekman doesn’t seem to know where to assign incentives. For instance, he writes:

These [luxury] journals aggressively curate their brands, in ways more conducive to selling subscriptions than to stimulating the most important research.

Publications don’t “stimulate” the most important research or typically set any type of research agenda. In nearly all instances, publications merely indicate the field or domain being covered and wait for outputs that match. Some compete aggressively for papers, but publications and publishers are generally relatively passive when it comes to establishing upstream incentives and research agendas for scientists. Accusing publishers of neglecting a duty they have never possessed is one of many blunders in the article.

Schekman mentions that he is an editor at eLife, which as far back as 2011 stated that it was taking clear aim at journals like Nature, Science, and Cell, making it nothing better than an OA luxury journal. True to form, eLife is not beyond touting its glamorous aspects. Currently, if you visit eLife, you are first shown the overlay below, clearly flaunting an incentive (the Nobel Prize) and associating that incentive with publishing in eLife in order to make it appear more of a luxury journal.

[Screenshot: eLife homepage banner announcing its board of reviewing editors]

The screenshot above has interesting incentive suggestions in it, all of which are misleading. For instance, Schekman’s work leading to the Nobel was not published in eLife. And notice how aggressively eLife is curating its own brand, both by producing the article discussed in this post and by splashing its Nobel-winning editor’s visage across its site. I think someone should write an article about these shameful and vain luxury publishers using incentives to entice authors in this manner.

There are numerous factual errors and intellectual sleights of hand to be found in Schekman’s article, leading some on social media to wonder if Schekman actually wrote it or just signed off on an eLife press release.

For instance, Schekman blames publishers for the over-emphasis on the impact factor — the same sort of blame the authors of DORA (the San Francisco Declaration on Research Assessment) also erroneously assigned to publishers. Some publishers tout their impact factor to attract better authors and better papers, but the main culprits in overemphasizing the impact factor are academics themselves, related policymakers and administrators, and tenure, grant, and advancement committees. Yet Schekman is unwilling to look in the mirror here, and instead brushes the blame off on the publications his journal is competing with.

When he was editor-in-chief of PNAS, Schekman sang a different tune, touting that journal’s impact factor in a 2008 editorial:

With a competitive impact factor of 9.6 and a 19% acceptance rate for papers submitted directly, PNAS remains one of the most prestigious and highly cited multidisciplinary research journals.

In the Guardian article, Schekman also bemoans how luxury journals can entice researchers to cut corners. But even Nobel laureates cut corners after winning the prize, as the case of Linda Buck shows. Does this mean that winning the Nobel Prize may distort a scientist’s behavior by making them sloppy or arrogant? Or perhaps the mere existence of the Nobel Prize causes misbehavior, as the story of Penn State’s Michael Mann suggests: he falsely claimed to share in the credit for Al Gore’s Nobel Peace Prize.

There just seems to be no getting out of this series of distortions via incentives. If you win, you might misbehave. If you want to win, you might misbehave. And whether you win or not, you might cut corners to make further positive outcomes more likely.

Of course, Schekman works at the University of California, Berkeley, a luxury university, as noted on the Telliamed Revisited blog, where the writer cleverly reworks Schekman’s article into a diatribe against institutions that distort behavior with their powerful brands. It’s reminiscent of a brouhaha that erupted in August around a young scientist who was accepted by a “glam journal” and who was castigated by Michael Eisen for not choosing an OA journal. She felt Eisen was being unrealistic, and others in the comment stream pointed out that Eisen’s lab routinely demonstrates a preference for scientists published in glamour journals when it does its hiring.

One corner that Schekman did cut in this article is one that has been cut routinely with eLife — namely, the conflict of interest corner. As a scientist, he surely knows that it’s important to declare entanglements that might affect your statement or judgment. Being a well-paid editor of eLife, which has been designed to compete with the luxury journals, certainly is a conflict of interest in an article calling for a boycott of his ostensible competitors. Yet, Schekman doesn’t say that his criticisms of Nature, Science, and Cell should be viewed as those of a straight-up competitor interested in weakening those journals. Schekman has been evasive before about eLife’s involvement in sketchy behavior. In February, he publicly threw PubMed Central and David Lipman under the bus for the eLife scandal, despite plentiful public evidence showing complicity between eLife, Wellcome, and PMC. When a self-proclaimed scientist takes the convenient route rather than the path of evidence, I worry a little.

There are times when this article reads like it was cooked up in the offices at eLife and put under Schekman’s name. Speculation in the Twitterverse also picked up on this idea. In fact, by heading just another (aspiring) luxury journal, Schekman is culpable for simply extending the problems he decries here. OA does nothing to change the incentives he’s complaining about, and may actually exacerbate them if competition for grant funding increases because grant dollars become scarcer (since more go to paying APCs) or if grants become more centralized at larger institutions or among more senior researchers. The problems with distorted and exaggerated incentives reside among academics and their institutions, not among journals.

For someone who is ostensibly so worried about distortions, it’s ironic that Schekman or his amanuensis has distorted so much in order to make rank competition appear to be pure and informed scientific idealism. When a multi-year goal of eLife has been to become the luxury OA journal and unseat at least one of the three journals listed in the article’s headline, it seems that, at least for now, the high level of hypocrisy demonstrated by that journal’s editor takes the prize.

Kent Anderson

Kent Anderson is the CEO of RedLink and RedLink Network, a past-President of SSP, and the founder of the Scholarly Kitchen. He has worked as Publisher at AAAS/Science, CEO/Publisher of JBJS, Inc., a publishing executive at the Massachusetts Medical Society, Publishing Director of the New England Journal of Medicine, and Director of Medical Journals at the American Academy of Pediatrics. Opinions on social media or blogs are his own.

Discussion

35 Thoughts on "This Takes the Prize — Editor of New Luxury OA Journal Boycotts Luxury Subscription Journals"

Erin McKiernan says she won’t publish in Cell, Science or Nature, and no-one cares because she’s Only An Early-Career Researcher who’s never demonstrated her ability to get into them anyway. Randy Schekman says the same and you disallow his pledge because he’s a Nobel prize winner who has published in them. I wonder what kind of researcher you would consider this statement interesting from?

(For the record: I will never send my work to these journals, either.)

I don’t “disallow his pledge” (whatever that means). I actually am glad his pledge got a lot of attention, because it’s self-serving and hypocritical. I’m glad a lot of people have seen it, and seen through it. He’s an editor of a luxury OA journal (no APCs, big funder backing, sexy design, touting his Nobel to entice submissions, rejecting about the same percentage of papers as most journals), and OA does nothing to solve the incentives problems he names. So, in addition to being disingenuous, Schekman’s article is just this side of foolish.

Now you’ve taken the pledge as well?! Oh dear.

The status as a Nobel Prize winner isn’t the core issue here (though that, along with his editorship at a “glamor” journal, does tend to raise the question of hypocrisy). The issue is that the pledge is misguided. The problems he’s talking about are in the academic career structure system. That is what is to blame and what needs to be fixed, based on what he’s describing. Boycotting those three journals just means three other journals will move into the vacuum created by their absence.

If you have back pain, you don’t solve it by boycotting select heating pad manufacturers. Journal publishing is a service industry. We only care about the Impact Factor because our customers (you) have told us to. Change your needs and we will change with you.

“We only care about the Impact Factor because our customers (you) have told us to.”

Ouch. There’s enough truth in that to make me very unhappy. That said:

“Change your needs and we will change with you.”

This of course is exactly what Schekman is trying to do (and McKiernan, and me).

Then you should be taking on academia, not publishers. Are you doing that?

Of course I am! Why would you doubt it?

I reject the false dichotomy here, by the way. Academics and publishers are both guilty in this vicious circle (along with some granting bodies). All of them are properly targets of criticism.

One is upstream from the other, though, as David points out. If academics started to say that traffic to their articles or Tweets meant more than citations and impact, publishers would behave accordingly. The dichotomy is real and meaningful. Your failure to appreciate and respect the structural aspects of the relationships is bizarre.

It’s widely recognised that the impact-factor vicious circle is, well, a cycle. Academics prize high impact factors, so journals trumpet their impact factors, which leads academics to believe they’re important, and so the cycle continues. All parts of that cycle are fair game. I attack it at all points. I don’t give academics a free pass; I wish you would be equally critical of publishers.

But there is a source to this river, circular though it may be. Journals are reacting to what academia asks them to react to. You are driving the bus (to really start mixing metaphors). Journal publishers can’t fix this problem for you. You and Schekman are essentially asking businesses to help change academic culture by doing a really poor job of meeting their customers’ needs. That’s not how business works.

Imagine the uproar if Elsevier declared that they were now taking charge of the tenure/hiring/funding culture for academia. Do you really want them running your university?

“Imagine the uproar if Elsevier declared that they were now taking charge of the tenure/hiring/funding culture for academia. Do you really want them running your university?”

No, I’m much happier the way things are now, with Thomson Reuters running it.

Thank you, thank you! I’m here all week! Don’t forget to tip your waitresses!

Publishers are an easy target, but in the long run, a distraction from the real work you’re proposing. It’s very easy to blame someone else, but harder to take a look in the mirror and attack your own systems (particularly when those systems have provided an enormous level of reward for oneself as is the case here). If you made Science, Nature and Cell disappear overnight, you’d just see PNAS, eLife and others move into the same slot and you’d be right back where you started.

Michael Eisen has it right:
http://www.michaeleisen.org/blog/?p=1495
“I think a better place to work is on hiring, grants and tenure.”

Authors certainly have a lot of choices, unless of course they are trying to get tenure. eLife is using Dr. Schekman’s Nobel Prize to promote their journal, no doubt. They live tweeted the entire trip to Karolinska! I think it’s fine to call out someone who has been and continues to be a significant part of the establishment when they tell others to avoid it. I would have given him more credit if his editorial was factual and made sense. I am surprised that someone who has been an editor for so long got so much wrong about publishing.

Bit of a truism, I know, but this highlights how academia is flawed more than any publication serving that market. Problem is, the flaws themselves are academic and so open to plentiful interpretation, debate, misdirection and selective use of data… Still hold out hope for a more scientific and reasoned approach here, but for now politics and media presence seem to be the channels and battleground of choice.

It is interesting how often it seems like we’re dealing with a branch of cable television rather than with scientists.

Policy and science are different human systems, with different rules. Policy is an advocacy system and there are good reasons for this.

But scientists should advocate with logical, evidence-based, and sound arguments. Hence, their policy pronouncements should be superior to, and more internally logical than, others’. However, when it comes to publication policy, the loudest voices from people donning the “scientist” mantle (in order to exploit the perception that they are objective observers) are too often the most unrealistic — or, in the case of Schekman, the most self-serving and hypocritical.

You can’t say “I’m a scientist” and then act like a heel without being called out for it. And that’s not reducible to a systems problem.

Yes, in an ideal world scientists should use logical and evidence-based arguments. But human frailty affects everyone. And let’s face it, many scientists (and not only them) think too highly of themselves and too little about what others may hear when they open their mouth.

Science has become (or has always been) too focused on the people working in the sciences rather than on the mundane task of doing science: developing hypotheses, spending long hours collecting data, and writing up results for publication.

Using rewards to acknowledge those who after much drudgery succeeded in pulling shining nuggets out of that big pan was meant not only to reward their success but also to shine the spotlight on the scientific endeavour. Since the latter cannot speak for itself, the former have taken over.

Just when I think scholarly publishing couldn’t be more convoluted and complicated, I talk to an academic and thank my lucky stars I am not in that system.

The symbiosis that ties academia and scholarly publishing evolved over many years. They gradually shaped each other and are now inextricably intertwined. Fixing one of them might, over time, cause the reformation of the other. Attempting to fix both at the same time is a daunting prospect. There are a lot of people who perceive their status and income as being threatened so it’s also very hard to generate an honest discussion of the issues involved.
That’s all a shame because there are many great opportunities to improve things in this field.

Brilliant deconstruction of Schekman’s breathtaking hypocrisy, Kent! (even if the degree of difficulty was low……..).

I believe the key point is “the main culprits in overemphasizing the impact factor are academics themselves, related policymakers and administrators, and tenure, grant, and advancement committees.” I made the same point in “We have met the enemy, and it is us”

Disclosure: I am Editor-in-Chief of GENETICS (a peer-edited journal of the Genetics Society of America)

Schekman: “As I know from my editorship of eLife, an open access journal funded by the Wellcome Trust, the Howard Hughes Medical Institute and the Max Planck Society, they are publishing world-class science every week.”

I looked at Google Scholar and Scopus data, and I would like to point out that the world-class science that eLife has published so far will very likely lead to a first impact factor (IF) of less than 2.5, about 1 point less than the expected 2013 IF of PLoS ONE.

Disclosure: I am an editor at Nature Materials.

Ouch. Did you adjust for the fact that they’ve only been publishing for a short time?

I’d imagine the response is that the IF is a crap measure. Whilst that’s a valid point, you’d still expect a journal trying to get to the top of the pile to have a higher impact factor than that, especially as they don’t publish the “slower” fields like ecology.

Disclosure: I’m executive editor on Methods in Ecology & Evolution

They can always negotiate with Thomson Reuters to have a big chunk of their published items removed from the IF denominator.

I looked at all the citations to all of eLife’s output, and at the number of published research articles (I did not count perspectives, editorials, and other non-primary papers). Dividing the first number by the second gives less than 2.5. I doubt that even with the citations coming in from the rest of the month they will reach this number.

How can we possibly calculate meaningful two-year impact factors for a journal that’s been publishing papers for one year?

Impact Factor is calculated as citations in year N to articles published in years N-1 and N-2, divided by the number of citable items published in those two years. The problem with the analysis, though, is that citation is a slow process. Most articles collect significantly more citations in their second year of counting toward the Impact Factor than in their first. Citations in 2013 to articles published in 2013 may prove a poor proxy for the eventual Impact Factor.
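Written out as a formula (a restatement of the standard two-year definition just described; the denominator counts only citable items such as research articles and reviews):

\[
\mathrm{IF}_{N} = \frac{\text{citations received in year } N \text{ to items published in years } N-1 \text{ and } N-2}{\text{citable items published in years } N-1 \text{ and } N-2}
\]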

@ Mike @ David

Indeed, the IF needs a two-year timeframe, which eLife has not yet reached. The number I quoted above was a prediction. I looked at the citations accumulated in 2013 for output published in 2013 for some journals that have a stable IF in the last few years, and from the correlation between both numbers I predicted the IF for eLife. Of course, the implicit assumption here is that the rate at which eLife is accumulating citations is similar to that of the other life science journals I looked at. Things may change, and we should take that 2.5 as a ballpark number.

Sorry, I did not explain it properly before.
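One way to read the prediction described in the comment above is as a simple linear fit of stable journals’ Impact Factors against their same-year citation rates. The sketch below uses made-up numbers purely for illustration; the commenter does not spell out the exact fitting procedure, so treat this as an assumed reading of the method rather than a reproduction of it.

import numpy as np

# Hypothetical reference journals with stable Impact Factors.
# same_year_rate = citations in 2013 to articles published in 2013,
# divided by the number of 2013 research articles. All numbers are invented.
same_year_rate = np.array([0.8, 1.5, 2.9, 4.1])
known_if = np.array([2.0, 3.6, 7.1, 9.8])

# Fit a straight line relating same-year citation rate to the known IF.
slope, intercept = np.polyfit(same_year_rate, known_if, 1)

# Apply the fit to a hypothetical same-year citation rate for eLife.
elife_same_year_rate = 1.0
predicted_if = slope * elife_same_year_rate + intercept
print("Predicted IF is roughly %.1f" % predicted_if)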

“Publications don’t ‘stimulate’ the most important research or typically set any type of research agenda. In nearly all instances, publications merely indicate the field or domain being covered and wait for outputs that match. Some compete aggressively for papers, but publications and publishers are generally relatively passive when it comes to establishing upstream incentives and research agendas for scientists.”

As I have tried to do many times in the past, I’d like to point out that this statement, while true for journal publishing, is less true for scholarly book publishing, where acquiring editors do sometimes play a role in persuading authors to write books in the first place and, indeed, sometimes even suggest research topics and offer authors advance contracts as incentives. Advance contracts don’t even exist in journal publishing, to my knowledge.
