The National Information Standards Organization (NISO) has recently released the results of its year-long study of altmetrics in draft form for comment. This is the first phase of a two-part grant funded by the Alfred P. Sloan Foundation, designed to identify and develop best practices and standards that can support the widespread adoption of new metrics. A series of meetings was held in San Francisco; Washington, DC; and Philadelphia, followed by roughly 30 interviews with engaged publishers, researchers, provosts, librarians, and funders to discuss the value of altmetrics to each stakeholder group and the issues that need to be addressed.
As an attendee at the Philadelphia meeting and one of the interviewers for this project, I found it clear that these new metrics can encourage researchers to consider the value of their work to those outside the academy. Established metrics tend to focus on activities within the academy: citation metrics represent author behavior, and downloads represent reader behavior. Newer metrics are beginning to tell us how research is engaging the general public, something that is growing in importance to funders.
The draft NISO Altmetrics Standards Project White Paper summarizes the current state of altmetrics and is organized into nine key topics where consensus on action items would support further development. Twenty-five potential action items are distributed across the nine topics below.
1. Definitions
Understanding alternative assessment metrics begins with the terms used to describe them. How we define and use these terms will influence the adoption of the metrics they describe. Some felt the term ‘altmetrics’ was limiting and that these metrics should be seen as complementary to established measures rather than a replacement for them. Suggested potential action items focus on refining specific definitions, the proper use of terms, and possible subcategories as needed.
2. Research Outputs
There is a need to recognize the value of different forms of scholarly output beyond the research paper, such as datasets, software, posters, theses, and performances. A few groups in Europe have worked to better define the many types of scholarly artifacts. However, there are questions about whether all of these outputs should be tracked and measured, and whether some should be seen as more important than others. Potential action items include identifying outputs that are applicable to the use of metrics, defining relationships between the outputs, and defining metrics and calculation methodologies.
3. Discovery
It is important to distinguish between the role of altmetrics in the discovery of ideas and their role in evaluation. Currently we tend to view all metrics through the lens of researcher assessment, which limits the development of altmetrics for other purposes. There are concerns that altmetrics focus too much on the author rather than on the reader. A potential action item is to determine primary use cases and develop needs assessments based on those use cases.
4. Research Evaluation
Although concerns were raised about the potential misuse of altmetrics, new measures are clearly needed to acknowledge new research outputs and to measure impact beyond the academy. Engagement with the public is a required component of the Research Excellence Framework (REF) in the UK, and is increasing in importance elsewhere as well. Potential action items include clearly stating the role of altmetrics and identifying scenarios for their appropriate use in research evaluation.
5. Data Quality & Gaming
Reliability and validity of data are essential for establishing trust in altmetrics. Comparing data between sources commonly yields different numbers. The irregular use of persistent identifiers makes it challenging to derive consistent metrics on new types of research outputs. Community consensus is required to distinguish which types of activities are considered acceptable promotion and which should be seen as gaming the system. Six potential action items were identified: facilitating the use of identifiers; researching the reproducibility of metrics across different providers; developing strategies to normalize source data across providers; exploring APIs and exchange formats for data gathering; considering open data, audits, and a clearinghouse to increase trust; and defining activities that should be considered gaming of new metrics.
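The identifier problem described above is easy to illustrate: the same work may be referenced as a bare DOI, a `doi:` string, or a resolver URL, and naive counting treats these as three different works. The sketch below (hypothetical variants and counts, not data from the white paper) shows one minimal normalization step that providers would need to agree on:

```python
import re
from collections import Counter

def normalize_doi(raw):
    """Reduce common DOI variants to a bare, lowercase DOI string."""
    doi = raw.strip().lower()
    doi = re.sub(r'^https?://(dx\.)?doi\.org/', '', doi)  # strip resolver URLs
    doi = re.sub(r'^doi:\s*', '', doi)                    # strip the doi: prefix
    return doi

# Mentions of the same work, recorded with inconsistent identifiers
# (illustrative examples only):
mentions = [
    "https://doi.org/10.1234/example",
    "doi:10.1234/example",
    "10.1234/EXAMPLE",
    "http://dx.doi.org/10.5678/other",
]

counts = Counter(normalize_doi(m) for m in mentions)
# Without normalization these four mentions would count as four works;
# with it, the first three collapse into one.
```

Real providers face messier cases (handles, PubMed IDs, repository URLs), which is why the white paper's call for consensus on identifier use matters.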
6. Grouping & Aggregation
Combining counts from multiple manifestations of a work (such as HTML and PDF) into a single score can improve understanding but risks introducing biases and errors. Potential action items address best practices for grouping and aggregating multiple data sources; grouping and aggregating data by journal, author, institution, or funder; and defining contributorship roles.
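Aggregating across manifestations can be sketched in a few lines. The example below (hypothetical work IDs, formats, and counts) keeps the per-format breakdown alongside the combined total, which is one way to mitigate the risk that a single rolled-up score hides where the activity actually occurred:

```python
from collections import defaultdict

# Per-manifestation download counts for two works (illustrative numbers):
events = [
    ("10.1234/example", "html", 120),
    ("10.1234/example", "pdf", 45),
    ("10.5678/other", "html", 30),
]

totals = defaultdict(int)       # combined count per work
by_format = defaultdict(dict)   # per-format breakdown, retained for transparency
for work, fmt, n in events:
    totals[work] += n
    by_format[work][fmt] = n

# totals["10.1234/example"] is 165, but by_format still shows
# that 120 of those came from HTML views and 45 from PDF downloads.
```

Retaining the breakdown is a design choice, not a recommendation from the white paper; it simply shows that grouping need not discard the source-level detail that auditing would require.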
7. Context
The lessons learned from the current use of the Impact Factor have raised concerns about expressing new metric data as a single number or score. The Impact Factor, for example, does not account for disciplinary and geographical influences, which makes comparing works by authors in different fields problematic. With social media, knowing who tweets can be more important than the raw number of tweets sent. The potential action item here is to establish a strategy for normalizing these characteristics over time.
8. Stakeholders’ Perspectives
The many different stakeholder perspectives offered provide a rich context for understanding both the interest in new metrics and the concerns that need to be addressed for altmetrics to become well established. Interest from faculty is affected by their career stage, though researchers are generally interested in recognition beyond the academy. Similarly, academic administrators expressed increased interest in impact assessment and visibility beyond their institution, as economic pressures emphasize the need to attract funders of research as well as students. Funding agencies realize the importance of the general public understanding the research being supported. Publishers are adopting altmetrics as part of a broader suite of measures. Social media captures the general public's interest in conversations from which they would previously have been excluded. Potential action items address the need to create use cases for different stakeholder groups and to identify contributor categories.
9. Adoption
Three activities were identified as having a higher priority: unique identifiers for scholarly works and contributors; standards for usage statistics in the form of views and downloads; and building infrastructure rather than detailed metric analysis. Outreach to faculty will be required, and discipline-related societies are viewed as a good channel for promoting the use of altmetrics by researchers. Potential action items include identifying organizations for further discussions, identifying existing standards to be applied, prioritizing further activities, and clarifying researcher strategy.
The second phase of this Altmetrics Project will involve the creation of a new Work Item Proposal for approval by the NISO topic committee and the designation of a Working Group to develop consensus standards and best practices. To help things move forward, please do read the report and add your comments.