Sitting through multiple presentations at last week’s STM Innovations Seminar in London, which focused on metrics and altmetrics, the audience was exposed to the word “impact” in many contexts — referring to the impact factor, usage factors, social media awareness, downloads, and various more and less acceptable attempts at meaningful correlations.
I started to notice how drained the term “impact” had become by the end of the day. It has a specific meaning in scholarly publishing, but one that is being blurred. “Impact” is such an attention-grabbing word that, well, you get more impact if you use it, even if it’s a stretch. Of course, the wear and tear on “impact” was accompanied by similar trafficking of the term “factor,” which was trotted out nearly as frequently.
But is the habit of using “impact” or “factor” a sign that altmetrics is a bit too beholden to trying to recreate the impact factor?
Originally, as one attendee noted, Eugene Garfield developed the impact factor to cut through the clutter of less objective and transparent attempts to assert authority and prestige. “Impact” had a specific meaning — impact on subsequent thinking of working scientists, realized through active citation in the literature.
This narrow definition is worth preserving. We need a word that touches on an endpoint that is demonstrable and intellectual, not speculative and work-based.
Choosing the right term is important to arriving at the right idea. If we stay stuck in the land of “impact,” we remain attached to established ideas, even subconsciously, which inhibits novel thinking.
For instance, awareness via social media does not strike me as impact. It is awareness. Awareness alone is not going to — or at least should not — generate intellectual outputs. It’s too shallow. It’s not unimportant, but we should call it what it is. Awareness is correlated with usage, which is not surprising, but worth noting. When is the “awareness” idea exhausted? Pretty quickly, I fear.
Usage is not impact. Current usage measures have many limitations, including no integration of print usage, no integration of pass-along usage, and no information about the strength of engagement (e.g., glance, read, distribute, save). Usage is important to note, but our current usage measures are about counting transactions, and they do not reflect any measure of intellectual output or long-term votes of value through actions like citation.
The impact factor has problems, but it has endured, and continues to provide some good information in an interesting and useful context. I hope new metrics emerge that help shed some light on the value of information’s journey through science. But if we continue to refer to every potential new measure as having “impact,” we’re showing an unnecessary fealty to the impact factor, an uncertainty about the independent value of our new ideas, and a lack of precision about exactly what new metrics are measuring.
Conflating new ideas with established ideas isn’t helping us think clearly or find ways to leapfrog where we are. Mental habits are a major factor in creativity — you have to break habits to think differently. And it’s hard to have an impact when everything is “impact.”