
Universal Citation Paper Lacks Universality


My Hero, Zero (Schoolhouse Rock, 1973)

What constitutes a high number of article citations? It often depends on the field.

A paper with 100 citations in cancer biology — a field dense with researchers, huge grants, short papers, and fast publication times — does not necessarily have the same impact as an economics paper with the same number of citations.

For this reason, comparing citation impact across disciplines is widely considered verboten.

In 2008, three Italian researchers tried to change that. In a paper published in PNAS, Filippo Radicchi, Santo Fortunato, and Claudio Castellano argued that it was possible to normalize citations across disparate disciplines so that the performance of papers published in different fields could be compared. Their technique was very simple — divide the number of citations to a paper by the average number of citations to all papers in its discipline for that year. As I reported in 2008, this simple transformation appeared to line citation distributions up like ducks in a row. With a universal citation metric, it would be far easier to evaluate the relative impact a paper, and its authors, had on science.
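The normalization described above can be sketched in a few lines. This is a toy illustration, not the authors' actual code; the fields and citation counts are made up for the example.

```python
# A minimal sketch of Radicchi-style normalization: divide each paper's
# citation count by the average citation count of its field for that year.
# All data below is illustrative, not real.
from statistics import mean

papers = [
    {"field": "cancer biology", "citations": 100},
    {"field": "cancer biology", "citations": 40},
    {"field": "cancer biology", "citations": 10},
    {"field": "economics", "citations": 100},
    {"field": "economics", "citations": 20},
    {"field": "economics", "citations": 0},
]

# Average citations per paper within each field.
field_avg = {
    field: mean(p["citations"] for p in papers if p["field"] == field)
    for field in {p["field"] for p in papers}
}

# Relative indicator: raw citations divided by the field average.
for p in papers:
    p["relative"] = p["citations"] / field_avg[p["field"]]
```

With these toy numbers, 100 citations in the higher-citing field yields a smaller relative score (100/50 = 2.0) than 100 citations in the lower-citing field (100/40 = 2.5), which is exactly the cross-field comparison the transformation is meant to enable.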

Such a discovery was big, bold, and beautiful. But like other novel and important claims made in science, it didn’t take long for other authors to attempt to validate the claim.

In a paper published in the January issue of the Journal of the American Society for Information Science and Technology, three researchers at the University of Leiden (Ludo Waltman, Nees Jan van Eck, and Anthony van Raan) dispute the universality of citations.

In their paper, titled “Universality of citation distributions revisited,” Waltman and colleagues expanded the analysis from the 20 disciplines reported in the original Radicchi paper to all 221 disciplines in the sciences and social sciences, as classified by Thomson Reuters’ Web of Science. After counting the citations to each paper after ten years, Waltman applied Radicchi’s normalization technique. Did the citation distributions for every field indeed line up? Not exactly. Waltman writes:

[M]any fields of science indeed seem to have fairly similar citation distributions. However, there are quite some exceptions as well. Especially fields with a relatively low average number of citations per publication, as can be found in the engineering sciences, the materials sciences, and the social sciences, seem to have nonuniversal citation distributions.

Apart from the limited sample in the original study, Waltman describes two oddities in the Radicchi paper that may explain the divergent results:

First, Radicchi included both research articles and letters in his analysis. Waltman finds this mix curious, since research articles have a very different citation profile from letters.

Second, and more importantly, Radicchi excluded all uncited papers from his analysis. Since many papers remain uncited over the years, Waltman questions why these valid observations were left out. The Radicchi paper does not explain the exclusion, stating only that including uncited papers does not change the results — a claim Waltman disputes by showing how the exclusion of uncited papers changes the distribution of citations, making the universality claim appear “more justifiable” than it is.

Zero is a very important number in scholarly communication. For many indicators of scholarly impact (citations, comments, blogs, and tweets), zero is the most frequent number encountered. It forms the anchor of a long tailed, skewed distribution in which most attention is lavished on a coveted few. Remove zero, and you’ve changed the very nature of that distribution.
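The effect of dropping zeros is easy to see with a small numeric example. The citation counts below are made up purely for illustration.

```python
# Illustrative only: dropping uncited (zero-citation) papers from a
# long-tailed, zero-anchored distribution inflates the average.
from statistics import mean

citations = [0, 0, 0, 0, 0, 1, 1, 2, 3, 5, 8, 40]  # mode at zero, long tail

with_zeros = mean(citations)                          # 60 / 12 = 5.0
without_zeros = mean(c for c in citations if c > 0)   # 60 / 7  ≈ 8.57
```

Since the normalization divides every paper's count by this average, excluding uncited papers shrinks every normalized score, not just the zeros — the whole distribution shifts.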

To quote Schoolhouse Rock, “Zero is a wonderful thing. In fact, Zero is my hero.”

About Phil Davis

I am an independent researcher and publishing consultant specializing in the statistical analysis of citation, readership and survey data. I am a former postdoctoral researcher in science communication and former science librarian.


4 thoughts on “Universal Citation Paper Lacks Universality”

  1. As an Editor-in-Chief, I always look at which articles have received zero cites. Basically, they represent a failure to attract interest, a lesson which can be applied to future selections. (It’s not a strict criterion, of course; sometimes you have to take a chance.) Not considering the zero cites seems like a poor way to compare journals.

    Posted by Ken Lanfear | Apr 24, 2012, 10:15 am
  2. Let me make a small addition to Phil’s story. Recently, a new paper by Radicchi and Castellano was published in PLoS ONE. This paper confirms that universality of citation distributions indeed does not hold generally. The authors propose an interesting alternative approach for making proper comparisons of citations in different fields of science.

    Ludo Waltman

    Posted by Ludo Waltman | Apr 24, 2012, 10:21 am
    • I find that the authors are playing with the word “universality.” In their latest PLoS ONE paper, they use phrases like “universal properties,” which amounts to admitting that their transformation does not hold for all fields, that variation exists among fields, and that there are exceptions.

      The transformation is almost linear for the majority of the subject-categories. Exceptions to this rule are present, but, in general, we find that all citation distributions are part of the same family of univariate distributions. In particular, the rescaling considered in [our PNAS paper], despite not strictly correct, is a very good approximation of the transformation able to make citation counts not depending on the scientific domain.

      While near-universal findings are commonly reported in much of the scientific literature, we are dealing with the construction of a tool to evaluate the impact of science and the career trajectory of scientists. For this reason, creating a simple metric that is a “good approximation” may not be good enough.

      Posted by Phil Davis | Apr 24, 2012, 10:37 am


