Every June, publishers and editors anxiously await the Parade of the Journal Citation Reports, and hope — some even pray — that their own numbers will increase.
After all, an increase is an increase, right?
In spite of a plea from one liblicense-l reader, it didn’t take long for publishers to begin crunching 2009 performance figures to see how well they did. Judging from several press releases, many publishers did very well. For example:
- “Elsevier saw over 65% of its journal Impact Factors (IFs) increase from 2008 to 2009.”
- “Wiley-Blackwell’s average Impact Factor saw a 5.3% rise last year. This year, the company’s average was 2.1, compared to 2.0 last year.”
- “Nature’s 2009 Impact Factor is its highest ever (34.480).”
So when is an increase not really an improvement?
Writing in the June issue of BioScience, Bryan Neff and Julian Olden, two biologists, argue that journal impact factor increases must be viewed against general background inflation. They write:
. . . much in the same way a modest salary raise effectively means very little in an increasingly costly economy, an increase in a journal’s impact factor must be interpreted with respect to background levels of inflation.
In examining 70 journals in ecology, they found that almost 50% showed increases in their impact factors, but at rates lower than the background inflation rate. In other words, they were failing to keep pace with inflation.
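The salary-raise analogy above can be put in numbers. A minimal sketch, using hypothetical impact factor values and the 2.6% average annual inflation rate that Althouse et al. report (cited below), of deflating a nominal increase by background inflation:

```python
# Hypothetical example: a journal's impact factor rises from 2.00 to 2.04
# (a 2% nominal gain) while background inflation runs at 2.6% per year
# (the JCR-wide average growth rate reported by Althouse et al.).
old_if, new_if = 2.00, 2.04
inflation = 0.026

nominal_change = new_if / old_if - 1
# Deflate the nominal change by background inflation to get the real change.
real_change = (1 + nominal_change) / (1 + inflation) - 1
print(f"nominal: {nominal_change:+.1%}, real: {real_change:+.1%}")
# → nominal: +2.0%, real: -0.6%
```

A 2% rise that trails 2.6% inflation is, in real terms, a slight decline — exactly the situation of the journals that "were failing to keep pace."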
Exploring the principal causes of impact factor inflation, Neff and Olden discovered that ecology papers are citing more papers — seven more, on average, in 2007 than a decade earlier. More importantly, proportionately more of these citations are to recent papers — papers eligible to count towards a journal’s impact factor calculation.
Reporting last year on a much larger sample (the entire list of journals in ISI’s JCR, or 4,300 titles), Althouse et al. found that 80% of the journals listed in the JCR showed an impact factor increase between 1994 and 2005, growing by an average rate of 2.6% per annum. Indeed, all subject disciplines revealed inflation with the exception of two: History, and History and Philosophy of Science, both of which showed negative growth.
Like Neff and Olden, Althouse et al. report that the chief explanation for general inflation is that reference lists have continued to get longer, although cross-disciplinary differences in impact factors could be partially explained by the proportion of references citing papers published within the last two years.
In 1969, Nature’s first impact factor was 2.343 and only 10 journals had an impact score of 10 or over. Given the compound nature of inflation, it shouldn’t take too long to see triple-digit figures.
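As a rough projection, and only under the simplifying assumption that the 2.6% per annum JCR-wide rate from Althouse et al. continues indefinitely, one can estimate how long it would take Nature’s 2009 figure to compound into triple digits:

```python
import math

# Figures from the post: Nature's 2009 impact factor and the average
# JCR-wide inflation rate reported by Althouse et al. (2.6% per year).
nature_2009 = 34.480
annual_inflation = 0.026

# Solve nature_2009 * (1 + r)**t = 100 for t under steady compound growth.
years_to_100 = math.log(100 / nature_2009) / math.log(1 + annual_inflation)
print(round(years_to_100))  # roughly 41 years, i.e. around mid-century
```

This back-of-the-envelope figure ignores the fact that Nature’s own impact factor has historically grown faster than the JCR average, so "not too long" may be optimistic in either direction.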
 Neff, B. D., & Olden, J. D. 2010. Not so fast: inflation in impact factors contributes to apparent improvements in journal quality. BioScience 60(6): 455-459, http://caliber.ucpress.net/doi/abs/10.1525/bio.2010.60.6.9
 Althouse, B. M., West, J. D., Bergstrom, C. T., & Bergstrom, T. 2009. Differences in impact factor across fields and over time. Journal of the American Society for Information Science and Technology 60: 27-34, http://dx.doi.org/10.1002/asi.20936
 Garfield, E. 1972. Citation analysis as a tool in journal evaluation. Science 178: 471-479, http://dx.doi.org/10.1126/science.178.4060.471