With the speed of communication today, researchers, authors, and grant funders are impatient for indicators of an article’s value. Waiting one to three years for publication and citation seems interminable, and conflating an article’s impact with its journal’s impact creates uncertainty as well.
Altmetrics attempts to close that gap by providing more timely measures that are also more pertinent to researchers and their articles. Usage metrics from downloads and blogs, and attention metrics such as tweets and bookmarks, can provide immediate indicators of interest. Although metrics associated with these activities are still in a developmental stage, there is growing investment across the broader landscape in producing more current metrics that serve researchers, their communities, and funding agencies.
In January, the Chronicle of Higher Education highlighted the work of Jason Priem, a PhD candidate at the School of Information and Library Science at University of North Carolina-Chapel Hill, who coined the term “altmetrics.” In his post, “Altmetrics: a Manifesto,” Jason noted the limitations and slowness of peer review and citations. He suggested that the speed with which altmetrics data are available could potentially lead to real-time recommendation and collaborative filtering systems. Jason and Heather Piwowar, who works at the Dryad Digital Repository, created Total-impact as a prototype in their spare time last year. Two months ago, they received a grant from the Sloan Foundation, and next month Heather will be working full time to more fully develop it and provide context on the data set.
While it may be easy to dismiss the idea that social media metrics can be meaningful for scholars, PLoS has been developing a suite of measures over the last three years referred to as article-level metrics (ALM) that provide a view of the performance and reach of an article. Their approach is to present totals of multiple data points including:
- Usage data (HTML views and PDF downloads)
- Citations (PubMed Central, Scopus, Crossref, Web of Science)
- Social networks (CiteULike, Connotea, Facebook, Mendeley)
- Blogs and media coverage (Nature, Research blogging, Trackbacks)
- Discussion activity on PLoS (readers’ comments, notes, and ratings)
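The approach above can be illustrated with a minimal sketch: collect data points per source, then total them by category for one article. The source names and counts here are invented for illustration, not actual PLoS ALM API fields.

```python
from collections import defaultdict

# Illustrative data points for a single article: (category, source, count).
# Values are made up; a real ALM pipeline would fetch these from each source.
EVENTS = [
    ("usage", "html_views", 5120),
    ("usage", "pdf_downloads", 830),
    ("citations", "scopus", 14),
    ("citations", "crossref", 12),
    ("social_networks", "mendeley", 47),
    ("social_networks", "citeulike", 9),
    ("blogs", "research_blogging", 2),
]

def totals_by_category(events):
    """Sum individual data points into per-category totals for one article."""
    totals = defaultdict(int)
    for category, _source, count in events:
        totals[category] += count
    return dict(totals)

print(totals_by_category(EVENTS))
```

The point of presenting totals rather than a single composite number is that each category answers a different question: usage signals reach, citations signal scholarly uptake, and social activity signals attention.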
Martin Fenner, an MD and cancer researcher in Germany, is working full-time as technical lead on PLoS ALM as of this summer. He brings experience as the creator of ScienceCard (author-level metrics) and is involved with the Open Researcher and Contributor ID (ORCID). As Cameron Neylon said in his 2009 article with Shirley Wu in PLoS Biology, “The great thing about metrics . . . is that there are so many to choose from.”
So which measures matter? Earlier this year, Phil Davis questioned the claim, made in an article by Gunther Eysenbach, that tweets can predict citations. Mendeley data, however, appear more relevant, and several research papers presented this year show a strong correlation between Mendeley readership and citation data. In fact, patterns of use indicate that some papers are widely shared but seldom cited, while others are frequently cited but appear to have limited readership. In a recent presentation, William Gunn of Mendeley offered this look ahead:
Useful as these new metrics are, they tell only part of the story. It’s a useful bit of info to know the volume of citations or bookmarks or tweets about a paper or research field, but the real value lies in knowing what meaning the author intended to express when he linked paper A to paper B. Through the layer of social metadata collected around research objects at Mendeley we can start to address this challenge and add some quality to the quantitative metrics currently available.
A growing community is forming around the topic and the conversation in June at the Altmetrics12 program focused on exploring the use of emerging tools and sharing findings of research results. As part of the Association for Computing Machinery (ACM) Web Science Conference, this daylong workshop attracted 60 very active participants in this budding community. Keynotes by Johan Bollen (Indiana University) and Gregg Gordon (SSRN) were accompanied by discussions of research and demonstration of 11 different tools.
One of those tools, Altmetric.com, created by Euan Adie, won Elsevier’s Apps for Science competition last year and is now part of the family of research tools at Digital Science, supported by Macmillan Publishers. The Altmetric Explorer tracks conversations around scientific articles in tweets, blog posts, and news coverage; the analyzed attention is summarized as a score displayed inside a “donut” whose colors reflect the mix of sources.
An important component of the altmetrics community was represented by the founders of two leading academic social networks: Mendeley (which also competes with citation managers) and Academia.edu (whose competition includes ResearchGate). Since these tools enable researchers to collaborate by posting and sharing their work, these systems could offer fertile ground for data to support the growth of altmetrics.
The most recent entrant in this arena is Plum Analytics, founded by Andrea Michalek and Mike Buschman, who were team leaders in the successful development and launch of ProQuest’s Summon. Andrea is building a “researcher reputation graph” that mines the web, social networks, and university-hosted data to map relationships between a researcher, his institution, his work, and those who engage with it. An interview with Andrea in semanticweb.com described how she is dealing with the issues of identifying a single researcher and a single article:
The Researcher Graph is seeded with departmental ontologies, document object IDs for published articles, ISBNs for books, and other data that universities typically already hold. For the document ID, Plum Analytics is creating a set of aliases and rules to find the different URIs by which a work is referenced, since one paper can legitimately live at 50 different places around the web; when someone tweets a link to one of those resources, the system will recognize it as the same work, even if it goes by a different alias from another publisher.
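The alias idea described above can be sketched as a normalization step: incoming links are reduced to a host-plus-path key and looked up in a table that maps every known alias to one canonical identifier. The alias table, hostnames, and DOI below are all invented for illustration; this is not Plum Analytics’ actual implementation.

```python
from urllib.parse import urlparse

# Hypothetical alias table: many URLs, one canonical DOI per work.
# Hostnames and the DOI are placeholders, not real resources.
ALIASES = {
    "publisher-a.example/articles/10.1234/example.001": "10.1234/example.001",
    "doi.example/10.1234/example.001": "10.1234/example.001",
    "mirror-b.example/pdf/42": "10.1234/example.001",
}

def canonical_id(url):
    """Resolve a URL to its work's canonical DOI, or None if unknown."""
    parsed = urlparse(url)
    # Normalize: drop scheme and a leading "www.", keep host + path (+ query).
    key = parsed.netloc.removeprefix("www.") + parsed.path
    if parsed.query:
        key += "?" + parsed.query
    return ALIASES.get(key)

# Two different links resolve to the same underlying work:
assert canonical_id("https://doi.example/10.1234/example.001") == \
       canonical_id("https://www.publisher-a.example/articles/10.1234/example.001")
```

Once links collapse to one canonical ID, tweets, bookmarks, and citations scattered across publisher sites and mirrors can be counted against a single work rather than fragmenting across 50 URLs.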
The resulting data set could provide a current complement to the institutional citation analysis services offered by Thomson’s Research in View and Elsevier’s SciVal.
The dimensions of altmetrics extend well beyond an effort to capture social media activity and use it as an indicator of subsequent citation ranking. New services such as Mendeley, whose goal is ‘to manage your research’, and Academia.edu, whose goal is ‘to accelerate research’, are tools for researchers in which usage data is a by-product that contributes to a scholar’s ‘footprint’. Altmetrics seeks to quantify the response to research and, ultimately, its influence across a global community.
Competition to secure grants and promotions is a familiar driver of the demand for metrics that represent value. While it is still early days for altmetrics, this nascent movement is gathering steam and will be the topic of many future conversations as we engage in evaluating the qualitative aspects of a new set of metrics that, like medicine, will be complementary rather than alternative.