
The “h-index”: An Objective Mismeasure?

Recently, I was surprised to find myself among a group of academic administrators who were comparing notes about how they use and rely upon the “h-index.” The h-index aims to quantify a researcher’s apparent contribution to the literature, rewarding a sustained body of cited work more than a single blockbuster paper.
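For the record, a researcher has an h-index of h when h of his or her papers have each been cited at least h times. Here’s a minimal sketch of the computation in Python (the citation counts are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts, one entry per paper:
print(h_index([50, 18, 7, 6, 5, 2, 1]))  # -> 5 (five papers cited at least 5 times each)
```

Note that the blockbuster paper with 50 citations counts no more toward h than the paper with 5 does; that is the sense in which the measure rewards breadth over a single hit.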

I was under the impression that the h-index was an obscure measure that hadn’t hit the mainstream yet. I was wrong. It’s not ubiquitous, but it is being used, and if you’re an academic, your tenure review may well take your h-index into account.

But there is a vital question about the h-index:

How fair is it to have your academic contributions boiled down to a single number?

A recent paper from the Joint Committee on Quantitative Assessment of Research discusses the potential pitfalls of over-reliance on quantitative measures like impact factors and the h-index.

I found the paper refreshingly frank. It’s from mathematicians, and they caution:

. . . it is sometimes fashionable to assert a mystical belief that numerical measurements are superior to other forms of understanding.

Amen. Judgment and wisdom are undervalued in many places these days, and this “mystical belief” in quantitative measures often subverts both.

Interestingly, the h-index has offshoots: the “m-index” (in which the h-index is divided by the number of years since the academic’s first paper, making it easier for young faculty to compete) and the “g-index” (which gives more weight to the most highly cited papers). Both are sketched below.
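Both variants are easy to compute from the same citation list. A quick sketch under the standard definitions (the m-quotient divides h by years of activity; the g-index is the largest g such that the top g papers together have at least g² citations), again with invented numbers:

```python
def m_index(citations, years_active):
    """m-quotient: the h-index divided by years since the first paper."""
    ranked = sorted(citations, reverse=True)
    h = sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)
    return h / years_active

def g_index(citations):
    """Largest g such that the top g papers have at least g^2 citations combined."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

papers = [50, 18, 7, 6, 5, 2, 1]         # hypothetical citation counts
print(m_index(papers, years_active=10))  # -> 0.5 (h of 5, normalized by a 10-year career)
print(g_index(papers))                   # -> 7 (the 50-citation paper pulls g above h)
```

Note how the single highly cited paper lifts g well above h; the g-index amplifies big papers rather than smoothing them away.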

The validity of measures like the impact factor, the Eigenfactor, the h-index, and others has been assessed using “convergent validity,” or how well these measures correlate with one another. As the authors put it:

This correlation is unremarkable, since all these variables are functions of the same basic phenomenon — publications.

Circular logic, indeed.
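Their point is easy to demonstrate: any two metrics derived from the same citation records will tend to rise and fall together, whatever their formulas. A toy simulation (all data randomly generated, nothing empirical) shows how strong the correlation can be even between crude measures:

```python
import random

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def pearson(xs, ys):
    """Sample Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
# 200 hypothetical researchers, each with a random list of per-paper citation counts.
researchers = [[random.randint(0, 100) for _ in range(random.randint(5, 60))]
               for _ in range(200)]
h_values = [h_index(r) for r in researchers]
totals = [sum(r) for r in researchers]
# Two "different" metrics, one underlying phenomenon; expect a high correlation.
print(round(pearson(h_values, totals), 2))
```

The h-index and raw citation totals here are computed quite differently, yet they track each other closely because both are functions of the same publication lists.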

Papers are cited for all kinds of reasons, most of them rhetorical. In fact, even non-rhetorical citations have been found to spring from a number of motivations beyond positive intellectual debt. Yet most people using simple measures like the impact factor and the h-index are unaware of these pitfalls and nuances.

These measures are all backward-facing as well, and that’s another problem: the past is often a poor predictor of the future.

However, if we were to accept that these measures are inherently sloppy and subjective even when they look clean and objective, that they reflect reputation and the “halo effect” as much as academic contribution, and that they are at best proxies for past performance, they might just work.

As a famous scientist and mathematician once said, “Everything should be made as simple as possible, but not simpler.”

About Kent Anderson

I am the CEO/Publisher of STRIATUS/JBJS, Inc., home of the Journal of Bone & Joint Surgery, JBJS Case Connector, JBJS Reviews, JBJS Essential Surgical Techniques, the JBJS Recertification Course, PRE-val, and SocialCite. Prior to this, I was an executive at the New England Journal of Medicine and Director of Medical Journals at the American Academy of Pediatrics.

Discussion

3 thoughts on “The “h-index”: An Objective Mismeasure?”

  1. Henry Small also promoted the idea that citations function as *concept symbols*, essentially a shorthand for an idea expressed by another author.

    For instance, if I cite Watson and Crick (Nature, 1953), most scientists will know that I’m referring to the concept of the double-helix of DNA, and I don’t need to go any further to describe the work.

    By analyzing the words around a citation, it is possible to create a collective interpretation of what that document stands for. The act of authorship and citation-making can therefore be viewed as a communal dialog.

    Small, H. 1978. Cited Documents as Concept Symbols. Social Studies of Science 8: 327-340.
    DOI: 10.1177/030631277800800305

    Posted by Philip Davis | Jul 1, 2008, 11:59 am

Trackbacks/Pingbacks

  1. Pingback: Eigenfactor « The Scholarly Kitchen - Nov 24, 2008

  2. Pingback: InCites — More Counting, But Does It Count? « The Scholarly Kitchen - Mar 25, 2009
