Guest Post – A Study of Commenting on PLOS Articles

Despite near consensus that commenting on academic articles is unpopular, there is surprisingly little publicly available data on commenting rates. To address this, a team of academics from the Universities of Sheffield and Loughborough has recently published research into article commenting on PLOS journals. Simon Wakeling, Stephen Pinfield and Peter Willett report here on their findings.

The Open Syllabus Project, Altmetrics, and a New Dataset

The Open Syllabus Project has created a database of over 1 million college syllabuses and extracted the names of the materials used in these courses. These materials are analyzed quantitatively and ranked. The creators of the service propose a new metric for the evaluation of academic publications.

Mr. Market is a Brilliant Editor

Of the many ways to measure the quality of a publication, one that is often overlooked is the workings of the marketplace itself. Purchases of published material are made in large part on the basis of the quality of that material, making the marketplace something of an editor of genius. This mechanism incorporates all other metrics, from impact factor to altmetrics. Unfortunately, the marketplace is not free to exercise its judgment when many participants seek dominant and even monopolistic control.

Altmetrics and Research Assessment: How Not to Let History Repeat Itself

Criticisms of altmetrics often seem to be equally applicable to other forms of research assessment, like the Impact Factor. Phill Jones suggests that is not because of a fundamental opposition to altmetrics but a fear that it will suffer the same pitfalls. The solution is to engage more with a somewhat neglected set of stakeholders; Informaticians.