A problem in recruiting competent peer-reviewers may be the fault of email spam blockers, not the unwillingness of academics to review, a new study reports.
In a correspondence to Nature ("No crisis in supply of peer reviewers," 23 Dec.), the editors of Molecular Ecology report that it was harder to recruit reviewers in 2010 than in 2001: editors had to send out more than two requests, on average, for every acceptance, compared with 1.4 in 2001.
The increase in unsuccessful requests, however, was not gradual but jumped (Figure 1) just as the journal moved from sending personal requests to letting the editorial system handle the emails automatically. The researchers surmise that many of these requests may not be reaching their intended target because they are being tagged as spam.
Unsolicited email comprises the vast majority of email traffic — 97% according to a 2009 report by Microsoft. As our software becomes more sensitive to detecting the patterns of email spam, it inadvertently sends some important correspondence directly to the trash.
The editors also found that the pool of reviewers scaled with the growth of submissions, meaning that the number of reviews per reviewer remained constant over time. They report:
Each Molecular Ecology submission has an average of 4.5 authors and decisions are based on an average of 2.7 reviews, so only 0.6 reviews per co-author are required to compensate for the review burden of each new article. These figures indicate that the reviewer pool still seems able to accommodate the increasing number of submissions.
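The arithmetic behind the quoted figure is straightforward, and a quick sketch makes it easy to check (the numbers are those reported above; the variable names are mine):

```python
# Back-of-envelope check of the Molecular Ecology figures quoted above.
avg_coauthors = 4.5   # average authors per submission
avg_reviews = 2.7     # average reviews behind each editorial decision

# Reviews each co-author must contribute to offset their own submission
reviews_per_coauthor = avg_reviews / avg_coauthors
print(f"{reviews_per_coauthor:.1f}")  # prints 0.6
```

In other words, as long as each co-author reviews less than one paper for every paper they submit, the pool keeps pace with growth.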
The real contribution of this paper is not the details from one scientific journal — although these alone are extremely helpful — but the reminder to avoid drawing fateful conclusions about peer review and the state of scholarly communication without a careful analysis of the data.
Related articles
- Are Peer-Reviewers Overloaded? (Scholarly Kitchen)
- Privatizing Peer Review: The PubCred Proposal (Scholarly Kitchen)
- The “Burden” of Peer Review (Scholarly Kitchen)
Discussion
9 Thoughts on "Trouble Recruiting Peer-Reviewers? Blame Spam!"
I can confirm this from my own anecdotal findings as a journal editor. The number of requested reviewers who don't respond at all generally exceeds the number who immediately say "no," and when contacted directly (not through our automated system), they almost always respond stating that they never received the first (automated) request.
As an acquiring editor for scholarly books, I am always seeking reviewers and have experienced no problem with reaching them via e-mail. I cannot recall any time when a reviewer complained that my e-mail ended up being filtered out as spam. I wonder why this would happen in soliciting reviews for journal articles and not for monographs?
When soliciting book author/editors, I tend to send out a personalized letter. For the journal, given the quantity and the tracking issues, everything runs through an automated system (Highwire’s BenchPress in my case).
This appends things like the manuscript number, and some generic text about our review process with instructions and links for how to access the system. I’m sure there’s something in the language or the format that gets picked up by spam filters, but I’m not sure what.
Sandy,
I don't think email itself is the problem, but rather the source of the email. Mail servers and clients often treat a personal email from Sandy Thatcher at Penn State differently than a message (embedded with several URL links) generated automatically by an editorial management site. The latter is often tagged as spam.
I think spam blockers also detect patterns in the message text itself. If a friend sends me a message saying “check out this link,” followed by a URL, it goes straight into my spam filter — even though it was a personal message sent only to me by a trusted sender. Spam detectors have figured out that the “check out this link” gambit is a common tool of malware distributors.
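The kind of pattern matching described above can be illustrated with a toy scoring function. This is a deliberately simplified sketch; real filters use statistical classifiers, and the phrases and weights here are illustrative assumptions only, not the rules of any actual spam blocker:

```python
import re

def naive_spam_score(message: str) -> int:
    """Toy heuristic illustrating pattern-based spam scoring.
    Phrases and weights are invented for illustration."""
    score = 0
    urls = re.findall(r"https?://\S+", message)
    score += 2 * len(urls)  # embedded links raise suspicion
    if re.search(r"check out this link", message, re.IGNORECASE):
        score += 3          # a phrase associated with malware distribution
    return score

# An automated invitation with an embedded link scores higher
# than a link-free personal note, even from a trusted sender.
print(naive_spam_score("Please review MS-123: https://mc.example.org/accept"))
print(naive_spam_score("Would you be willing to review a manuscript for us?"))
```

This is consistent with the observation in the thread below that link-free reminder emails get through more often than invitations carrying accept/decline links.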
Thanks for writing about our piece. I agree that the links we include in the invitations are the chief suspect, as the link-free reminder email we send does get through more often. We would remove the accept/decline links, but the people who do actually receive the email find them very useful. It's a bit of a bind…
Kudos to Tim Vines for publishing these data. As has been argued elsewhere (http://www.ipetitions.com/petition/fix-peer-review/blog/2983), it would be of great interest if many journals would do the same.