Image: Stopwatch2.jpg (Photo credit: Wikipedia)

PLoS ONE has been one of the signal experiments in scientific publishing since it came into the journals market in late 2006. By promising authors three distinct advantages — unfettered access for readers, a higher likelihood of acceptance, and faster times to publication — PLoS ONE rapidly gained traction as a place for authors to publish their works more quickly and with less hassle.

Speed has become a dimension of publishing competitiveness over the past decade. Search Google for “journal fast publication,” and you get a long list of publishers — most newer, but some traditional — competing on shorter publication times. PLoS ONE shows up 5th on the list, behind BioMed Central, Springer’s Fast Track publishing (an approach to reducing time from acceptance to publication to less than 20 days), BE Press, and Copernicus Publishing. Fifth position is a high ranking on Google, given that the other listings are for publishers, while PLoS ONE’s is for a single title.

Before the move to online publication, even contemplating a faster publication cycle seemed doomed — there were low expectations among authors; printing and mailing added major barriers; and the pace of publishing was slow to the point of being almost languid. But soon after journals moved online, various practices — online-first, pre-publication versions, and general improvements in speed thanks to email instead of FedEx — became possible.

Some journals have made their speed a source of competitive differentiation and pride, becoming known on the market for rapid publication. PLoS ONE is one of these, and boasts about its speed without qualification:

Results published FAST

PLoS ONE couples efficient and objective peer review with a streamlined electronic production workflow.

This claim is asserted elsewhere, with promises of “Fast publication times” and the like.

But is PLoS ONE delivering?

Sample data comparing speed between 2006 and 2011 suggest that PLoS ONE is slowing down significantly. Comparing 10 articles published at the end of September 2011 with 10 articles published at the end of 2006, looking at time from submission to acceptance, from acceptance to publication, and overall (submission to publication), the data indicate a significant slowdown in every aspect of PLoS ONE’s process:

  • Average times from submission to acceptance increased from 58 days to 161 days
  • Average times from acceptance to publication increased less, from 36 days to 52 days
  • Overall, times from submission to publication increased from 94 days to 213 days, or from 3+ months on average to 7+ months on average
  • The maximum time to publication increased as well, from 141 days in 2006 to 330 days in 2011
  • The fastest time to publication in 2006 was 18 days, while in 2011 it was 42 days
  • Measured as a ratio, the maximum time to publication in 2011 was 234% of the 2006 maximum (a 134% increase), while the minimum was 233% of the 2006 minimum (a 133% increase)
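As a quick arithmetic check, the ratios implied by the day counts quoted in the bullets above can be recomputed with a few lines of Python (the figures are those stated in the post; the labels are just illustrative):

```python
# Day counts (2006, 2011) quoted in the bullets above.
samples = {
    "submission to acceptance (avg)": (58, 161),
    "acceptance to publication (avg)": (36, 52),
    "submission to publication (avg)": (94, 213),
    "maximum time to publication": (141, 330),
    "minimum time to publication": (18, 42),
}

for label, (days_2006, days_2011) in samples.items():
    # 2011 value expressed as a percentage of the 2006 value,
    # e.g. 330 / 141 -> roughly 234% of 2006.
    ratio = days_2011 / days_2006
    print(f"{label}: {days_2006}d -> {days_2011}d ({ratio:.0%} of 2006)")
```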

Judging from the increases in the standard deviation in each small data set, the main problem seems to be that the reviewer pool isn’t able to keep up with the volume of submissions, and may not be growing as quickly as the rate of submission. The standard deviation of the time between submission and acceptance nearly tripled between the 2006 articles and the 2011 articles sampled.
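For readers who want to replicate this kind of check, here is a minimal sketch of the calculation, assuming you have gathered submission and acceptance dates for a sample of articles (the date pairs below are made up for illustration, not taken from PLoS ONE):

```python
from datetime import date
from statistics import mean, stdev

# Hypothetical (submitted, accepted) date pairs for a small sample of articles.
sample = [
    (date(2011, 2, 1), date(2011, 7, 30)),
    (date(2011, 3, 15), date(2011, 8, 1)),
    (date(2011, 1, 10), date(2011, 5, 2)),
    (date(2011, 4, 20), date(2011, 11, 28)),
]

# Days from submission to acceptance for each article.
durations = [(accepted - submitted).days for submitted, accepted in sample]

# The mean tracks overall speed; the sample standard deviation tracks
# how variable the review process is across articles.
print(f"mean: {mean(durations):.1f} days")
print(f"sample standard deviation: {stdev(durations):.1f} days")
```

Running the same two numbers over a 2006 sample and a 2011 sample is all the comparison in this post amounts to; the interesting part is gathering enough dates for the statistics to be stable.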

Another 10 articles sampled from 2010 show that the slowdown falls roughly on the line the 2006 and 2011 points suggest, with the time between submission and acceptance contributing the most consistent data, while time in the production system shows some deflections, which I take as signs of efforts to speed things back up once papers got into production. This makes sense, as production people are trained to manage variance, and they work in a more controlled and controllable environment. Nonetheless, the slowness is creeping in even there.

These data put PLoS ONE on track to be about average for speed compared to many other journals, including their competitors in the open access space.

Just to make sure there wasn’t a “summer doldrums” effect in here somewhere (after all, the sample data came from articles published in late September), I looked at 10 more articles published in late June 2011. There may be a little effect around summertime schedules, as the average time from submission to acceptance in the June sample set was 127 days (compared to 161 days in the September sample). However, time from acceptance to publication was actually higher in the June sample set (58 days compared to 52 days), and the overall time between submission and publication in the June sample set was 185 days, so about 6 months.

A complete study of article submission, acceptance, and publication dates would be quite interesting, especially if it covered all the years of PLoS ONE’s publication. Based on these admittedly sketchy data, I’ll bet such a study would show a gradual northeast slope to the overall data as times in all three bins slowly creep upward over time.

It’s also worth noting that many journals no longer publish submission and acceptance dates along with their manuscripts. By continuing to publish these dates, PLoS ONE backs up its claims about speed with information users can check for themselves, and it should be commended for that.

It isn’t unusual for the review process to become a source of delay (most journals struggle with coordinating peer review, keeping things moving along, and adding reviewers to keep up with the supply of papers), but PLoS ONE is growing so fast that the effect is probably especially pronounced. And when a process expands beyond its most devoted adherents to a slightly less engaged group, as seems inevitable when a journal grows, delays creep in.

One way many journals have dealt with this is to create a fast-track option at some point, a way for selected papers to move more quickly through peer-review while those without the need for speed come along at a more normal pace. This lets the editors compete for top papers more effectively. But the fast-track aspect that has emerged at many top-tier journals may not exist at PLoS ONE, possibly because they’ve promised that everything would be fast.

As PLoS ONE has grown, it’s not surprising that the review process has slowed while the production system is also bogged down trying to cope with the throughput.

After five years of impressive growth, PLoS ONE’s publication times are looking a bit more average. And while they’re faster than some journals’ publication times, the trend isn’t headed in the right direction, at least if these admittedly limited data are representative. The trend is toward months of waiting for authors — toward becoming more like the journals PLoS intended to compete against.

Kent Anderson

Kent Anderson is the CEO of RedLink and RedLink Network, a past-President of SSP, and the founder of the Scholarly Kitchen. He has worked as Publisher at AAAS/Science, CEO/Publisher of JBJS, Inc., a publishing executive at the Massachusetts Medical Society, Publishing Director of the New England Journal of Medicine, and Director of Medical Journals at the American Academy of Pediatrics. Opinions on social media or blogs are his own.

Discussion

16 Thoughts on "Is PLoS ONE Slowing Down?"

As a consultant I bootleg a basic research program, which I do not publish in scholarly journals, in part because of the hassle factors Kent is describing. Kent’s precedent is therefore very exciting, and I look forward to publishing some hopefully useful results. For example, I recently did a quick study that suggests that 60% of citations occur in the first 25% of a journal article. This is where the problem being addressed by the research is explained. These citations are basically journalistic, so they are not necessarily a measure of impact.

This is a sound point, but I’m not sure that it necessarily takes the wind out of PLoS’ sails. Among my colleagues who have published in PLoS, there’s a general sense that the process does take a good deal longer than you’d expect from an internet-only model. For me, this serves to reinforce the impression that the respective PLoS titles are, in fact, regular journals (with regular, albeit impressive, bespoke impact factors), irrespective of their having established a very successful Open Access cost model. If their own data softens their PR claims a bit, so be it — PLoS is still comfortably faster than a traditional print cycle, and comfortably slower than any venue which does not offer pre-publication peer review.

Regardless of their publication volume (that is, “factories spewing pollution” notwithstanding), it’s important to keep in mind that PLoS hasn’t really done much to change *how* we publish; only the underlying economics. Given the recent string of imitators, I’d say they’ve been a terrific success in that regard, and we can look forward to the net economic benefits quite independently of any other “new” publication models.

Alex, I don’t agree that PLoS is comfortably faster than any print cycle. The biweekly journal I run has an average time from submission to first decision (for reviewed papers) of 46 days, and an average of another 46 days from acceptance to online publication. Print publication after that takes an extra month. Average time from submission to online publication for April 2011 submissions was 186.5 days; all of these compare well to the numbers above.

Comparing total times from submission to publication is a bit dicey, as these times include both the journal’s effort (submission to decision, acceptance to publication) and the amount of time the authors spent revising their paper. The journal clearly cannot control the latter, so it’s not entirely fair to include it. Furthermore, editorial decisions that demand more changes from the authors necessarily take more time to implement, and sometimes there is a second round of review. In fact, it may be that the increased variance and higher mean in submission to publication times reflects a greater proportion of 2011 papers having a second round of review.

It seems like a good thing if PLoS One is giving out more demanding editorial decisions and sending more papers for another round of review; the quality of the scientific record hinges much more on the quality of their output than on a difference of a few weeks in their decision times.

I agree with this analysis to some extent. It is indeed true that some slowness is creeping in as submissions skyrocket. However, it is not a good idea to take examples from July to October and January. That is the worst period, where pile-ups are huge due to holidays in Europe and Christmas/New Year.

I compared the first issues of PLoS ONE, where you would think speed would be at a premium, to three different small sets of data, none of which was in January. I checked for the “summer doldrums” effect, as I explained in the post, and there was a small effect, but not much in the areas that mattered. If you’d like to re-read the post and let me know if I missed something, that’d be great.

Interesting analysis, but with 10 observations per time period your results are not very stable, hence I question how trustworthy your conclusions are. You may well be right, but you should at least calculate some standard errors, and even those would be a bit suspect with 10 observations.

Also, if you are really interested in this, it might be a good idea to pull data from a little later than 2006, maybe spring of 2007. Late 2006 is right when PLoS One started, and probably not the best measure of how the review system worked once it stabilized.

I am actually harvesting data from the PLoS articles via software for another research purpose but I am capturing the submission, acceptance and publication dates.

I just perused the Jan 2011 data, all 895 records. What strikes me is the amount of variability in the submission-to-acceptance times. Some are nearly as long as a year, others only about a month. As someone noted above, this could be as much due to authors procrastinating on revisions as to the peer review process. Acceptance to publication looks to be pretty consistently about a month. Again, this is just from glancing through the data in a spreadsheet-like view into the database, and it is from January of this year. At some point when I get time, I’ll calculate some good statistics on this issue and report them.

Here’s a more comprehensive data set, comparing Jan and Feb 2010 with Jan 2011. It is a much more compressed time comparison, but a period during which the number of articles published in PLoS One per month nearly doubled. Almost no difference in submit-to-accept times, and a 10-day increase in accept-to-publish times. These statistics include every publication during those two periods.

http://dl.dropbox.com/u/9300438/PLoS%20One%20Review%20Publication%20times.pdf

These results fall between the two time periods above; perhaps the time increases leveled off and then went up again in the last 9 months of this year. There could also be significant sampling error in the very small samples used to calculate the original statistics.

Perhaps there’s some threshold that was reached in 2011. Remember that in 2010, PLoS ONE published around 6800 articles, and is on pace for 12000 in 2011. That’s a significant increase since your last data point, so I’m willing to bet there’s likely some change in performance associated with a 76% increase in output.

David, I think that is a reasonable hypothesis but it turns out not to be correct. Your post motivated me to collect some more data since I had already written the software to harvest the metadata from PLoS One. I captured the metadata for articles published in Sept 2011, the most recent month for which complete data are available. The results can be accessed at:

http://dl.dropbox.com/u/9300438/Sept2011.pdf

The submission-to-publication time did go up a few days, and the accept-to-publish time dropped back down to about 30 days.

Perhaps they had shorter review times initially, when the journal first launched. I don’t want to spend the time going back and capturing that data. But the assertion that PLoS One review/publication times have been slipping as their publication volume has been going up exponentially is just not true.

David,

Thanks for these data. They’re more complete than mine, and that’s good. However, these are still in the 2010/11 range, and not comparing those data to earlier data, as you indicate. It’s also interesting that my samples of a week late in the month (so, ostensibly 25% of your sample) varied so much from what you saw for an entire month. I wonder if there’s some batching process at PLoS ONE that would make the end of the month slower, a sort of “clean up” week.

Interesting data. If you have more, please share.

Thanks.

Thanks for the additional data. One thing that’s interesting is that whether it has slowed down or not, we’re talking about an average of around 140 days from submission to publication, and that’s still 3 or 4 weeks slower than the small sample size data showed for other more selective journals that have a publish-ahead-of-print option (by my counts from the 10 latest published articles, PNAS averaged 111 days, NAR 119). The 107 day average for submission to acceptance seems to show that the streamlined peer review requirements don’t greatly shorten the process, and the real speed comes in the time from acceptance to publication, which is more a function of an online-only journal without page limits than it is the review guidelines.

As Kent notes, it would be interesting to see how that 140 days compares with the journal’s earlier years, when the total article count was lower.

Kent, you are welcome. I don’t think that is the case. I selected out the articles from Sept 23–30. The times went up a few days, but that is it. My guess is the discrepancy is due to sampling error. Again, 10 is a very small sample, and it’s a pretty wide distribution. There are outliers at both ends of the spectrum, which isn’t too surprising. One or two in a sample of ten could throw the results off by quite a bit.

http://dl.dropbox.com/u/9300438/sept23-30.pdf

Below is a link to the raw data in spreadsheet format for that week.

http://dl.dropbox.com/u/9300438/sept23-30.xlsx

I sorted the data by submit-to-accept times.

Kent, I think you’ve missed a key point here in favor of the OA megajournals and their speed: the real advantage they have comes not from a rapid peer review process (using your methodology, PLoS ONE’s process takes as long as a small sample from Cell or Nature Cell Biology), but instead from a high acceptance rate. An author knows that he will likely only have to go through one tedious review process for a high-volume megajournal, as they’re going to accept 70% of what’s submitted. Other journals may have faster peer review and publication turnarounds, but because they’re more selective, the article has a higher chance of being rejected and will need to start over from scratch.

Some other thoughts, from a quick analysis using your (admittedly flawed, due to small sample size) methods on a variety of journals to get a quick snapshot of things:
The other biology OA megajournals — G3, Nature Communications, Scientific Reports, BMJ Open (but not Open Biology, as they’ve yet to publish any papers) — are all much faster than PLoS ONE at this point, suggesting that volume is to blame here. If speed is your top priority, these appear to be your fastest way to get something published.

But “published” is a tricky word: if one looks at other journals that do publish-ahead-of-print (PNAS’ “Early Edition” or OUP’s “Advance Access”), and if that counts as “published,” since the article is publicly available rather than appearing in an actual issue, then journals like PNAS or Nucleic Acids Research are right up there with the fastest megajournals without sacrificing quality selection.

It’s an interesting thing, how this OA sheep gets sheared down. First, the citation advantage made its woolly debut. It has since been disproven, but for a while it was the primary fuzz on the OA animal. Speed to publication in the sense of “this journal is fast” (not meta speed) has been another advantage used to blanket the prospective author with warm fuzzies. Now, the speed advantage at a single publication is likely to give way to a speed advantage in general, which makes the mega OA journals more susceptible to competition, especially from houses that do cascading peer review (who will have essentially the same acceptance-rate advantage). And, as you note, a lot of high-quality journals can get certain material out faster, which is why I mentioned that an overall pledge of speed may actually make you slower, while a fast-track allows discretion, letting the most important authors and papers through faster. This fast-track approach becomes even more important when volume increases, as you note.

But since every post can’t cover every thought, I had to limit this to a quick “Is PLoS ONE living up to its promises of FAST publication?” test. Is it less of a hassle for some authors? That’s a different question, and one that leaves the door to competition open.

I’m not sure it’s fair to put this on all of OA, as highly selective OA journals like PLoS Biology or PLoS Medicine never promised the sorts of speed to publication offered by the high-volume megajournals. I think it’s important to separate the access model from other aspects of the editorial and publishing process; not every OA journal is a megajournal. There seems to be a definite speed advantage for an online-only journal, particularly one with no page limits or limits on the number of articles that can come out at a given time. These sorts of journals (regardless of the access model) appear to have much faster turnaround numbers post-acceptance, as they don’t have to wait for space in an issue to open up. The same goes for publish-ahead-of-print methods as well.

That said, it will be interesting to see if the slowdown you noted is an endemic problem of volume, or if it can be remedied by increasing staff levels, finding ways to automate parts of the process or convincing reviewers to turn things around faster (good luck with the latter).

Comments are closed.