Open peer review does not affect the quality of reviews, but it does result in significantly longer reviewing time and makes it harder to recruit competent reviewers, a new study reports.
The article, “Effect on peer review of telling reviewers that their signed reviews might be posted on the web,” was published on November 16th in BMJ by Susan van Rooyen and others.
In a randomized controlled trial, reviewers were allocated to either the intervention group (told that their signed review might be posted on the web alongside the paper) or the control group (signed review provided to the author only). But even getting to this stage was difficult: a full 55% of reviewers refused to participate in the open peer-review experiment.
When reviews were completed, editors rated the quality of the open and control reviews as equal. Telling reviewers that their report would be publicly available neither harmed nor improved its quality.
Yet reviewers in the open peer-review group reported that their task took an additional 25 minutes on average to complete, and 41 additional minutes for papers that were eventually accepted. The authors write:
Reviewers who knew that their report might be posted online spent longer on the task than those in the control group, so adopting open peer review might result in the process feeling even more arduous to reviewers than it currently does. This is a concern because willing reviewers are already the scarcest component in the peer review process.
These results should not be controversial.
Bernd Pulverer, head of scientific publications for EMBO, reported recently in Nature that requiring reviewers to create publicly-visible reviews that would accompany the published paper adds significant time to the process, while adding nothing to the quality of the review.
More importantly, requiring open peer review may make it more difficult to attract competent reviewers, the best of whom are already overloaded with requests. In a related experiment on open peer review, in which reviewers were told that their identities would be revealed to the author (but not to the public), more reviewers declined to participate, and there was no difference in the quality of the reviews. A similar study in a specialist journal reported that signed reviews were of slightly higher quality, more courteous in tone, took longer to complete than unsigned reviews, and were more likely to recommend acceptance, suggesting that social factors may have been at play.
The latest study, published just last week, relies on data collected between 1999 and 2000. The delay in publication was due, in part, to the death of the lead author, Susan van Rooyen. I asked Tony Delamothe, deputy editor of the BMJ and corresponding author for the paper, whether the results would look different if the study were conducted today. He responded that open peer review was a pretty radical idea more than ten years ago, but given changes at the BMJ and other major journals over the last few years, he doesn’t think the idea is as controversial today. He would expect fewer refusals to review but no change in the quality of reviews. “The consensus seems to be in favour of more openness,” Delamothe believes.
In spite of the costs, there are clear benefits to opening the “black box” of the peer-review process, the authors write. Potential authors would be able to glimpse what peer review actually means at a journal and how much value is added between initial submission and publication. More importantly, posting signed comments beside accepted papers means that reviewers may start receiving credit for their contributions to science. Open peer review — and the transparency it brings to the process — is ultimately a matter of weighing costs against benefits.
The question is whether the benefits of open peer review are sufficient to outweigh this price of extra time and the associated reluctance of some reviewers to participate.
- Is Open Peer Review the Fairest System? Yes | No (BMJ)
- The Price of Transparency and Peer Review (scholarlykitchen)
- Are Peer-Reviewers Overloaded? Or Are Their Incentives Misaligned? (scholarlykitchen)