What Motivates Reviewers? An Experiment in Economics
Shorter deadlines, email reminders, and cash incentives can speed up the peer review process while minimizing unintended effects, a recent study suggests. Can they work for other disciplines?
Peer review, journal reputation, and fast publication were selected by Canadian researchers as the top three factors in deciding where to submit their manuscripts, trumping open access, article-level metrics, and mobile access, a recent study reports.
If we were to build a citation reporting system today, what would it look like? In this post, I propose doing away with a separate Journal Citation Report (JCR) and replacing it with a suite of services built around the Web of Science, directed to the needs of journal editors and publishers.
Are authors leaving PLOS ONE for higher performing journals?
When novel, newsworthy results are discovered to be wrong, is that still news?
Adam Etkin describes the workings and rationale for scoring papers and journals based on the rigor of peer review they received prior to publication.
A new study reports on the usage half-life of articles in thousands of academic and professional journals. The results may help in the formation of public access policy and the setting of access embargoes.
One month after Science Magazine published its exposé on the lack of peer review in, and the deceptive business practices of, many open access journals, investigative reporter John Bohannon responds to critics.
Providing free access to online books has no effect on sales and citations but increases online downloads. A critical review of the study’s research proposal three years ago foretold these very results.
What can be learned from John Bohannon’s investigative study of open access publishers?
The design and construction of article performance measures can reveal deeply held biases.
Revisiting a post from 2011 that called for evidence for a better understanding of access to the research literature.
An animated bubble plot of nearly four thousand biomedical journals over ten years reveals success, decline, and the shifting nature of science publishing.
Are we witnessing the decline of the open access megajournal and a return to a discipline-based model of publishing?
Authors should not be surprised when their open access articles show up in unexpected places. Is it possible to embrace open access with some restrictions?