Believe it or not, the current structure of journal peer review represents an optimal solution to multiple conflicting demands: reviewers should be anonymous to encourage objectivity, but someone needs to vouch for their expertise to the authors (one of the reasons we have Editors). Busy researchers rarely volunteer to do full-length reviews, so someone (the Editorial Office) needs to invite them directly and keep them on track to deliver their comments in reasonable time. Having 4 or more reviews generates a lot of revision work for the authors and takes up too many reviewers’ time for just one article, so journals aim for 2-3 reviews, and so forth.
However, the current structure may only be a ‘local optimum’: a radically different but superior system may exist that we can’t reach by tinkering with the current one. Bold experiments with peer review are therefore very welcome, as one of them might just hit upon a new and better approach to research evaluation.
One such bold experiment is Select Crowd Review, which is implemented by chemistry publisher Thieme and the commenting platform Filestage.io. Their project recently made it to the finalist stage of the ALPSP Award for Innovation in Publishing, so I got in touch with Niklas Dorn (the CEO and Co-Founder of Filestage) to find out more.
What drove Thieme and Filestage to experiment with crowd review?
The idea for Expert Crowd Review came from Prof. Benjamin List (Max Planck Institute for Coal Research). He was not satisfied with the speed of the peer review process and was looking for ways to reduce review time significantly. At the same time, he wanted to streamline the process and make reviewing simpler for reviewers.
Ben List’s idea was basically to leverage the intelligence of the crowd to review scientific papers faster. So instead of sending a manuscript out to just 2 reviewers, as in traditional peer review, it would be sent to a selected crowd of reviewers, who would then all review the document at the same time. This speeds up the review process significantly because, in a crowd of 20 reviewers, enough of them will always be available to review within a short timeframe.
After learning about this idea the Thieme Group started a collaboration with the online proofing specialist Filestage. Filestage then built a secure and encrypted crowd review solution and ran multiple proofs of concept with Thieme.
Could you tell us briefly how it works?
Sure, the expert crowd review solution is secure software that runs in the browser; no download or installation is required.
- Editors share manuscripts with a selected group of reviewers via a secure, encrypted review link in the browser.
- Reviewers review, comment on, and approve a manuscript anonymously in the browser. Reviewers collaborate seamlessly and discuss different viewpoints together.
- Editors have a clear overview of the review and approval status of their manuscripts at all times. The editors can see whether a manuscript has been approved by the crowd or rejected with change requests.
- All comments from the crowd can be exported as one PDF and sent back to the authors for further improvements.
How are reviewers selected and invited?
The editors select the reviewers from their own database. Once the reviewers have been selected, they can be invited to crowd review in two ways.
The first option is to invite reviewers via a protected review link that allows them to access a file directly, review it, and approve it. The editor can share this link via email or via their own reviewer management system.
The second option is to add reviewers by email address to specific expert crowds in Filestage. Once a reviewer is part of an expert crowd, they receive an automatic invitation whenever new manuscripts within their domain are waiting for review. Editors can nurture their crowds and add or remove reviewers at any time.
What’s a typical timeline for expert crowd review?
Reviews are completed within 72 hours, compared with the months it can take in the traditional peer-review process.
How do the reviewer comments differ from those obtained through standard peer review?
Control experiments showed that the crowd collectively provided more comprehensive feedback than traditional peer review. For the first time, there are real discussions between the reviewers, since they all review together in the same document. Reviewers actually reported that reviewing feels less like work because a real scientific discussion is happening. I think herein lies a big opportunity for the scientific community: to have more interaction in the review process.
Are there any empirical comparisons of the two approaches?
Synlett and Prof. Benjamin List have A/B tested the two approaches simultaneously: the same paper was reviewed by expert crowd review and, in parallel, through a traditional peer-review process. Crowd review was completed much faster. Taken together, responses from the crowd showed at least as much, and often more, attention to fine detail (including supporting information outside the main article) as those from conventional reviewers.
You can find the detailed article on the Nature blog.
Does the short timeline exclude reviewers who think more deeply about the article?
No, so far we have not seen that. If a reviewer decides to participate in crowd review, they typically have 72 hours to review. This is more than enough for most reviewers, since they rarely spend more than a few hours reviewing a single manuscript.
What other effects do you see?
Aside from being much, much faster, better, and more pleasant for all involved? A lightened workload for the editorial office, which no longer needs to send endless streams of invitations to review, followed by ever more strident reminders to deliver.
Why do you think that other areas haven’t adopted expert crowd review?
We decided to keep our crowd review solution exclusive to hand-picked journals and publishers at the start. For us and Thieme, it was important to first learn from the editors and the crowd reviewers what works well and what could be improved. With that knowledge, we have further developed the functionality of the crowd review solution.
How can a journal participate in crowd review?
We are now slowly opening the crowd review platform to more journals. Interested journals and publishers can apply for expert crowd review via our website to run their own proof of concept.