The National Information Standards Organization (NISO) has recently released the results of their year-long study of altmetrics in draft form for comment. This is the first phase of a two-part grant funded by the Alfred P. Sloan Foundation, designed to identify and develop best practices and standards that can support the widespread adoption of new metrics. A series of meetings was held in San Francisco; Washington, DC; and Philadelphia, followed by ~30 interviews with engaged publishers, researchers, provosts, librarians, and funders to discuss the value of altmetrics to each stakeholder group and the issues that need to be addressed.
As an attendee at the Philadelphia meeting and one of the interviewers for this project, I came to see that these new metrics can encourage researchers to consider the value of their work to those outside the academy. Established metrics tend to focus on activities within the academy: citation metrics represent author behavior and downloads represent reader behavior. Newer metrics are beginning to tell us how research is engaging the general public, something that is growing in importance to funders.
The draft NISO Altmetrics Standards Project White Paper summarizes the current state of altmetrics and is organized into nine key topics where consensus on action items would support further development. Twenty-five potential action items are distributed across the nine topics below.
1. Definitions
Understanding alternative assessment metrics begins with the terms used to describe them. How we define and use these terms will influence the adoption of the metrics they describe. Some felt the term ‘altmetrics’ was limiting and that these metrics should be seen as complementary to, rather than a replacement for, established measures. The potential action items suggested focus on refining specific definitions, the proper use of terms, and possible subcategories as needed.
2. Research Outputs
There is a need to recognize the value of different forms of scholarly output beyond the research paper, such as datasets, software, posters, theses, and performances. A few groups in Europe have worked to better define the many types of scholarly artifacts. However, there are questions about whether all of these outputs should be tracked and measured, and whether some should be seen as more important than others. Potential action items include identifying outputs that are applicable to the use of metrics, defining relationships between the outputs, and defining metrics and calculation methodologies.
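To make the idea of "defining relationships between outputs" concrete, here is a minimal sketch of my own (not taken from the white paper) of how non-article outputs and their links to a parent article might be recorded so that activity can later be attributed correctly. The relation names echo the spirit of DataCite-style relation types, but the exact vocabulary and field names here are illustrative assumptions.

```python
# Hypothetical illustration of recording research outputs and their relationships.
from dataclasses import dataclass, field

@dataclass
class Output:
    identifier: str              # e.g. a DOI or another persistent identifier
    output_type: str             # "article", "dataset", "software", "poster", ...
    relations: list = field(default_factory=list)  # (relation, target identifier) pairs

article = Output("10.1000/paper.1", "article")
dataset = Output("10.5061/dataset.9", "dataset",
                 relations=[("IsSupplementTo", "10.1000/paper.1")])
software = Output("10.5281/software.3", "software",
                  relations=[("IsReferencedBy", "10.1000/paper.1")])

# With explicit links, activity around the dataset or code can be credited to the
# related article, or counted separately, depending on the chosen policy.
for output in (dataset, software):
    for relation, target in output.relations:
        print(f"{output.output_type} {output.identifier} {relation} {target}")
```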
3. Discovery
It is important to distinguish between the role of altmetrics in the discovery of ideas and their role in evaluation. Currently we tend to view all metrics through the lens of researcher assessment, which limits the development of altmetrics for other purposes. There are concerns that altmetrics focus too much on the author rather than on the reader. A potential action item is to determine primary use cases and develop needs assessments based on these use cases.
4. Research Evaluation
Although concerns were raised about the potential misuse of altmetrics, new measures are clearly needed to acknowledge new research outputs and to measure impact beyond the academy. Engagement with the public is a required component of the Research Excellence Framework (REF) in the UK, and is increasing in importance elsewhere as well. Potential action items include clearly stating the role of altmetrics and identifying scenarios for their appropriate use in research evaluation.
5. Data Quality & Gaming
Reliability and validity of data are essential for establishing trust in altmetrics. Different numbers are commonly seen when comparing data from different sources, and the irregular use of persistent identifiers makes it challenging to derive consistent metrics on new types of research outputs. Community consensus is required to distinguish which types of activities are considered acceptable promotion and which should be seen as gaming the system. Six potential action items were identified: facilitating the use of identifiers; researching the reproducibility of metrics across different providers; developing strategies to normalize source data across different providers; exploring APIs and exchange formats for data gathering; considering open data, audits, and a clearinghouse to increase trust; and defining activities that should be considered gaming of new metrics.
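As a rough illustration of the reproducibility question, the sketch below compares counts for a single DOI across two metrics providers and flags disagreements. It is my own example, not from the white paper; the provider endpoints are hypothetical placeholders, and real services differ in their APIs, response formats, and terms of use.

```python
# Hypothetical cross-provider comparison of altmetric counts for one DOI.
import json
import urllib.request

PROVIDERS = {
    "provider_a": "https://api.provider-a.example/metrics?doi={doi}",       # hypothetical endpoint
    "provider_b": "https://api.provider-b.example/v1/works/{doi}/counts",   # hypothetical endpoint
}

def fetch_counts(doi):
    """Collect source -> count data for one DOI from each provider."""
    results = {}
    for name, template in PROVIDERS.items():
        with urllib.request.urlopen(template.format(doi=doi)) as resp:
            payload = json.load(resp)
        # Assumption: each provider returns a flat JSON object of source -> count.
        results[name] = {source: int(count) for source, count in payload.items()}
    return results

def report_discrepancies(doi):
    """Print any source whose counts differ between providers for the same DOI."""
    counts = fetch_counts(doi)
    sources = set().union(*(c.keys() for c in counts.values()))
    for source in sorted(sources):
        values = {name: c.get(source, 0) for name, c in counts.items()}
        if len(set(values.values())) > 1:
            print(f"{doi} [{source}]: counts disagree -> {values}")

if __name__ == "__main__":
    report_discrepancies("10.1000/example.doi")
```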
6. Grouping & Aggregation
Combining counts from multiple manifestations of a work (such as HTML and PDF) into a single score can improve understanding, but it comes with the risk of introducing biases and errors. Potential action items are to address best practices for grouping and aggregating multiple data sources; grouping and aggregating data by journal, author, institution, or funder; and defining contributorship roles.
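One minimal sketch of such aggregation, under my own assumptions rather than anything specified in the white paper: roll up view counts from several manifestations of the same work into one total per DOI, while keeping the per-manifestation breakdown so the aggregation can be audited rather than hidden behind a single score. The field names are illustrative, not a standard exchange format.

```python
# Hypothetical roll-up of per-manifestation view counts into per-DOI totals.
from collections import defaultdict

events = [
    {"doi": "10.1000/xyz123", "manifestation": "html", "views": 420},
    {"doi": "10.1000/xyz123", "manifestation": "pdf",  "views": 180},
    {"doi": "10.1000/abc456", "manifestation": "html", "views": 95},
]

def aggregate_by_doi(records):
    """Sum counts across manifestations, retaining the breakdown for auditing."""
    totals = defaultdict(lambda: {"total": 0, "by_manifestation": defaultdict(int)})
    for rec in records:
        entry = totals[rec["doi"]]
        entry["total"] += rec["views"]
        entry["by_manifestation"][rec["manifestation"]] += rec["views"]
    return dict(totals)

print(aggregate_by_doi(events))
```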
7. Context
Lessons learned from the current use of the Impact Factor have raised concerns about expressing new metric data as a single number or score. The Impact Factor, for example, does not account for disciplinary and geographical influences, which makes comparing works by authors in different fields problematic. With social media, knowing who is tweeting can be more important than the raw number of tweets sent. The potential action item here is the establishment, over time, of a normalization strategy for these characteristics.
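To show why a raw count alone can mislead, here is one possible normalization strategy, offered as my own assumption rather than a recommendation from the white paper: express a work's count relative to the average for comparable works in the same field and year. The baseline figures below are invented for illustration.

```python
# Hypothetical field/year normalization of a raw attention count.
FIELD_YEAR_BASELINES = {
    # (field, year) -> mean tweet count, estimated from some reference set (invented here)
    ("oncology", 2013): 12.4,
    ("sociology", 2013): 3.1,
}

def normalized_score(raw_count, field, year):
    """Return the ratio of a work's count to its field/year baseline,
    so 1.0 means 'about average for comparable works'."""
    baseline = FIELD_YEAR_BASELINES.get((field, year))
    if not baseline:
        return None  # no baseline available; leave the value uninterpreted
    return raw_count / baseline

# The same raw count means very different things in different fields:
print(normalized_score(12, "oncology", 2013))   # ~0.97, about average
print(normalized_score(12, "sociology", 2013))  # ~3.87, well above average
```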
8. Stakeholders’ Perspectives
The many different stakeholder perspectives offered provide a rich context for understanding both the interest in new metrics and the concerns that need to be addressed for altmetrics to become well established. Interest from faculty is affected by their career stage, though researchers are generally interested in recognition beyond the academy. Similarly, academic administrators expressed increased interest in impact assessment and visibility beyond their institution, as economic pressures emphasize the need to attract funders of research as well as students. Funding agencies recognize the importance of the general public understanding the research being supported. Publishers are adopting altmetrics as part of a broader suite of measures. Social media captures the general public's interest in conversations from which they would previously have been excluded. Potential action items address the need to create use cases for different stakeholder groups and to identify contributor categories.
9. Adoption
Three activities were identified as having higher priority: unique identifiers for scholarly works and contributors; standards for usage statistics in the form of views and downloads; and building infrastructure rather than detailed metric analysis. Outreach to faculty will be required, and discipline-related societies are viewed as a good channel to promote the use of altmetrics by researchers. Potential action items include identifying organizations for further discussions, identifying existing standards to be applied, prioritizing further activities, and clarifying researcher strategy.
Next Steps
The second phase of this Altmetrics Project will involve the creation of a new Work Item Proposal for approval by the NISO topic committee and the designation of a Working Group to develop consensus standards and best practices. To help things move forward, please do read the report and add your comments.
Discussion
14 Thoughts on "NISO Vets Research on Altmetrics"
I am not a publisher, but an active researcher who listens in to the discussions on The Scholarly Kitchen with great interest. One of the most striking things for me in the discussions about altmetrics is that I get the impression it is almost totally unknown in the research community. I have tried to talk about it with colleagues, and I have so far not met a single one who knows anything about altmetrics, and this includes senior officers at learned societies. I would think that, regardless of what one thinks about altmetrics, it should be imperative to involve the scientific community to a much larger extent, as soon as possible. We scientists may not be able to affect the development, but it will definitely affect us.
Surveys have confirmed your observation that, among stakeholders, researchers are at this point less familiar with altmetrics and perhaps less motivated because of their reliance on established metrics. As funders and provosts acknowledge the importance of newer metrics, this will change over time. The other driver will be peers who find new metrics of value. The recent launch of the ORCID iD and COUNTER’s article-level downloads are key parts of the infrastructure that are just being introduced now and will provide further support to altmetrics at a more granular level than previously possible.
Judy’s report summary is most helpful. The numerous mentions of the need to have research engage with a wider audience are noted and critical. For the researcher this may mean dissemination through vehicles other than academic journals (tweets are mentioned here). It also means different styles and methods of communicating. The other point made here is that funders of research are considering impact broader than academic journals and Journal Impact Factors. The audience (students, lay scholars, and a larger public) requires a different consciousness from the researcher when communicating.
For the academic journal, it raises numerous issues, such as Open Access that transcends the current academic and professional audience, different publication “styles,” and the review of writing. It might also implicate the review/editorial process. Economics enters here as well, including other vehicles that may compete.
For institutions, it might affect purchasing decisions and even the use of traditional journals for promotion and tenure.
It opens up the entire question of knowledge distribution, access, and usage. In many ways it could significantly affect the business model of academic journals.
>> There are concerns that altmetrics focus too much on the author rather than on the reader. A potential action item is to determine primary use cases and develop needs assessments based on these use cases. <<
I'd argue that even this understanding of possible stakeholders is too narrow. Librarians are another big potential use case. They produce and steward heaps of digital scholarly content. Altmetrics can help them better understand the use of digital special collections, plan for collection development, etc.
If an academic publisher’s marketing department tweets about a new book, and that tweet sends twenty readers to Project Muse to look at that book, is that worth including in the altmetric statistics, or is that just marketing and therefore something different? Would it change if the publisher linked to Amazon instead of Project Muse? Do the credentials of the twenty readers matter? What if half were in higher ed and half were not? If the author retweets her publisher’s tweet, and that sends another twenty readers to Project Muse to read her book, is that same tweet no longer marketing but something worth including in altmetrics? Do the credentials of the author’s followers/readers matter?
I love the idea of using altmetrics, but knowing what and when to count a click as a referral seems so tricky. From the perspective of a university press that, like other university presses, heavily uses social media to market and promote, I’ve seen publishers use social tools far more extensively than almost any author, and these are the kinds of questions I wonder about when thinking about just what exactly altmetrics are measuring.
Despite the bow to “research outputs,” altmetrics is really a concern, indeed an obsession, for people involved with journal publishing. Do publishers of scholarly books need to be interested at all? After all, books, unlike journal articles, routinely get reviewed by experts in scholarly journals, and this system of measuring value seems to have worked well for hundreds of years. Is anyone unhappy with it? Did the study even bother to address it?
One of the issues I had as a biology books editor was that for many researchers, book chapters didn’t “count” as much toward career advancement as journal articles. One could write an identical review article and publish it in a book and get little credit, or publish it in a MEDLINE-indexed journal with an Impact Factor and get career credit. Often, my requests for chapters would reluctantly be turned down. If there are ways to get scholarly outputs like this better recognized, then that would indeed benefit book publishers.
Point well taken, David, although for any really thorough review of an edited volume, there are usually comments about each chapter as well as an overall assessment.
I’m really glad to see a neutral party like NISO stepping in and driving the agenda here rather than relying on for-profit companies who might place their own interests over those of the research community (though their input is certainly an important part of the process).
I worry, though, that the really thorny issues are more conceptual and likely can’t be solved through setting up standards and action items. What’s really needed are better measurements (and perhaps definitions) of “impact,” both in terms of research quality and societal impact. Much of what is offered by altmetrics, at least to date, centers around attention metrics. Attention is certainly worth measuring and can provide valuable insights, but it says little about the quality of the work or the real-world improvements that have resulted from it.
I am concerned that altmetrics are being defined by a few prominent commercial ventures, when the academic bibliometrics community is also doing a great deal of research on this topic. I recently saw a paper that looked at 106 different article level metrics, none of which are being flogged commercially. Altmetrics is a major research field, not just a product line. I do not see a lot of evidence that NISO is looking at this robust scientific field.
Today’s blog post is an excellent example of the importance of altmetrics. Without qualification, this blog entry, while not a traditionally measured contribution, is quite simply the best summary of the issues, concerns, rationale, and opportunities surrounding altmetrics that I have read.