Every June, publishers and editors anxiously await the Parade of the Journal Citation Reports, and hope — some even pray — that their own numbers will increase.
After all, an increase is an increase, right?
In spite of a plea from one liblicense-l reader, it didn’t take long for publishers to begin crunching 2009 performance figures to see how well they did. And judging from several press releases, many publishers did very well. For example:
- “Elsevier saw over 65% of its journal Impact Factors (IFs) increase from 2008 to 2009.”
- “Wiley-Blackwell’s average Impact Factor saw a 5.3% rise last year. This year, the company’s average was 2.1, compared to 2.0 last year.”
- “Nature’s 2009 Impact Factor is its highest ever (34.480).”
So when is an increase actually a decrease?
Writing in the June issue of BioScience, biologists Bryan Neff and Julian Olden argue that increases in journal impact factors must be viewed against general background inflation. They write:
. . . much in the same way a modest salary raise effectively means very little in an increasingly costly economy, an increase in a journal’s impact factor must be interpreted with respect to background levels of inflation.
In examining 70 journals in ecology, they found that almost 50% showed increases in their impact factors, but at rates lower than the background inflation rate. In other words, they were failing to keep pace with inflation.
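Neff and Olden’s salary analogy can be made concrete with a little arithmetic. The sketch below is illustrative only: the journal figures are hypothetical, and the 2.6% inflation rate is the JCR-wide annual average that Althouse et al. report elsewhere in this post.

```python
# Illustrative sketch: a nominal impact factor increase can still be a
# real-terms decline once background inflation is taken into account.
# The journal numbers are hypothetical; the 2.6% annual inflation rate
# is the JCR-wide average reported by Althouse et al.

def real_if_growth(if_old: float, if_new: float, inflation: float) -> float:
    """Return the inflation-adjusted (real) growth rate of an impact factor."""
    nominal = if_new / if_old - 1.0            # e.g. 2.00 -> 2.10 is +5.0%
    return (1.0 + nominal) / (1.0 + inflation) - 1.0

# A journal whose IF rises 1.5% nominally, against 2.6% background
# inflation, has actually lost ground in real terms.
print(round(real_if_growth(2.00, 2.03, 0.026), 4))  # negative real growth
```

On these assumptions, roughly any journal whose nominal increase falls short of 2.6% per year is, like the stagnant salary in Neff and Olden’s analogy, falling behind.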
Exploring the principal causes of impact factor inflation, Neff and Olden discovered that ecology papers are citing more papers (seven more, on average, in 2007 than a decade earlier). More importantly, proportionately more of these citations are to recent papers, the ones eligible to count toward a journal’s impact factor calculation.
Examining a much larger sample last year (the entire list of journals in ISI’s Journal Citation Reports, or 4,300 titles), Althouse et al. reported that 80% of the journals listed in the JCR showed an impact factor increase between 1994 and 2005, growing at an average rate of 2.6% per annum. Indeed, every subject discipline revealed inflation with the exception of two, History and History and Philosophy of Science, both of which showed negative growth.
Like Neff and Olden, Althouse et al. report that the chief explanation for general inflation is that reference lists have continued to lengthen, although cross-disciplinary differences in impact factors could be partially explained by the proportion of references citing papers published within the previous two years.
In 1969, Nature’s first impact factor was 2.343, and only 10 journals had an impact score of 10 or over. Given the compound nature of inflation, it shouldn’t take too long to see triple-digit figures.
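How long until triple digits? A back-of-envelope sketch, assuming (purely for illustration) that impact factors keep inflating at the 2.6% annual rate Althouse et al. observed for 1994–2005, applied to Nature’s 2009 figure of 34.480:

```python
import math

# Back-of-envelope compound-growth estimate. Assumption: impact factors
# keep inflating at the 2.6% annual rate Althouse et al. observed for
# 1994-2005. Starting point: Nature's 2009 impact factor of 34.480.

def years_to_reach(current: float, target: float, rate: float) -> float:
    """Years of compound growth at `rate` for `current` to reach `target`."""
    return math.log(target / current) / math.log(1.0 + rate)

years = years_to_reach(34.480, 100.0, 0.026)
print(round(years))  # roughly four decades at this rate
```

At 2.6% per annum the wait is on the order of forty years, though Nature’s own historical growth (2.343 to 34.480 over four decades) has far outpaced the background rate, so "not too long" may be right.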
Neff, B. D., and Olden, J. D. 2010. Not So Fast: Inflation in Impact Factors Contributes to Apparent Improvements in Journal Quality. BioScience 60(6): 455-459. http://caliber.ucpress.net/doi/abs/10.1525/bio.2010.60.6.9
Althouse, B. M., West, J. D., Bergstrom, C. T., and Bergstrom, T. 2009. Differences in impact factor across fields and over time. Journal of the American Society for Information Science and Technology 60: 27-34. http://dx.doi.org/10.1002/asi.20936
Garfield, E. 1972. Citation analysis as a tool in journal evaluation. Science 178: 471-479. http://dx.doi.org/10.1126/science.178.4060.471
5 Thoughts on "Impact Factor Inflation: When an Increase is Actually a Decrease"
According to Thomson Reuters, the 2008 JCR included more than 8,000 journals, and the 2009 JCR more than 9,100. According to The Scientist (http://www.the-scientist.com/blog/display/57500/), “more than 4700 titles showed an increase over their 2008 impact factors.”
Thomson Reuters’ JCR is often miscited as the “Journal of Citation Reports.” But now there’s an alternative: Nicholas Knouf’s Journal of Journal Performance Studies (JJPS, http://turbulence.org/Works/JJPS/), an “artistic intervention into the scholarly communication system” which features a Firefox extension (“for all your journal performance needs”), an Internet radio station, and a scholarly journal. The Firefox extension, currently still in beta, offers “standard values such as SJR and the H-Index, as well as our own special ones such as the Frobpact Factor™, Eigenfrob Factor™, Frobfluence™, and new! The Click Value! We’re always developing new ways to measure journal performance, so stay tuned!”
The real fun is that users surfing publishers’ websites with the extension get told what any journal costs their institution and receive real-time updates on stock headlines and quotes.
I agree that part of the explanation for IF inflation is indeed the growth in the number and freshness of citations. This growth is itself rather interesting, and is no doubt fueled at least in part by the increasing power of academic search engines: it’s increasingly easy to discover good, relevant literature from a keyword search.
But I’m surprised the article doesn’t mention another likely cause of the inflation: publishers are getting better at working the system with review articles, citations in “front matter,” and other tricks (for a nice review of these, see Falagas and Alexiou 2008). One of the many downsides to a monolithic metric like the IF is that people start to get pretty good at gaming it.
Falagas and Alexiou. 2008. The top-ten in journal impact factor manipulation. Archivum Immunologiae et Therapiae Experimentalis 56(4). http://www.ncbi.nlm.nih.gov/pubmed/18661263
University of California Press has made the cited BioScience article freely available for the next 30 days. Follow the link to CrossRef in the blog post and select ‘Caliber’ to view the article.