2014 Journal Impact Factors

JCR visualization

Later today (June 18, 2015), the 2014 edition of the Journal Citation Reports (JCR) will be released, listing citation performance metrics for 11,149 journals. While the JCR calculates many different citation-based metrics, most editors and publishers will be chiefly interested in just one: the Journal Impact Factor (JIF).

The JIF measures, for any given year, the citation performance of journal articles in their second and third years of publication. Despite regular attacks on the use of this metric for evaluation purposes (viz. DORA), JIFs are considered a crucial factor in where scientists choose to submit their manuscripts. A high initial JIF can result in a flood of new manuscripts (e.g. PLOS ONE); conversely, a steady JIF decline can signal that scientists should submit their best papers elsewhere. For new journals without a JIF, encouraging submissions can be a real challenge.

This year, 272 journals will receive their first Impact Factor. The JCR will also suppress 39 titles: 29 for high rates of self-citation and 10 for “citation stacking,” a behavior that resembles a citation cartel. Suppression from the JCR lasts one year and requires reevaluation before a journal is relisted. Fifty-three percent of journals will receive an increase in their Impact Factor over last year.

While most editors and publishers will be primarily interested in their Journal Impact Factor, the JCR includes other citation-based metrics: an Impact Factor based on a five-year observation window, the Citing and Cited Half-Life, and two network-based metrics, the Eigenfactor and the Article Influence Score, both of which weight citations by the importance of the citing journal within the citation network.

This year, the JCR is adding two complementary calculations so that journals can be compared within and between subject disciplines. The JIF Percentile simply translates a journal’s category rank into a percentile. For example, a journal that is ranked 19 out of 291 Biochemistry & Molecular Biology journals would receive a JIF Percentile score of 0.94. Curiously, using the JCR normalization method,* a journal ranked first among seven Andrology journals would only receive a score of 0.93.
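Using the formula given in the footnote, (n − r + 0.5)/n, both examples above can be checked in a few lines of Python (`jif_percentile` is an illustrative helper, not part of any JCR tool):

```python
def jif_percentile(rank, n_journals):
    """JIF Percentile per the footnote formula: (n - r + 0.5) / n,
    where r is the journal's descending rank within its category."""
    return (n_journals - rank + 0.5) / n_journals

# Journal ranked 19th of 291 Biochemistry & Molecular Biology titles
print(round(jif_percentile(19, 291), 2))  # 0.94

# Top-ranked journal among just 7 Andrology titles
print(round(jif_percentile(1, 7), 2))     # 0.93
```

The small-category quirk falls directly out of the half-journal offset: in a seven-journal category, that 0.5 is worth about seven percentile points, so even the top journal cannot approach 1.0.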

The second new metric, the Normalized Eigenfactor (NE) Score, converts a journal’s Eigenfactor into a multiplicative score centered around 1, such that a journal with an NE score of 2 is twice as influential as the average journal in the network. It should be noted that Eigen-based metrics (unlike the Impact Factor) require iterative computation over the entire citation dataset, and there is no standard for how that calculation is done. Scopus, a competing citation dataset produced by Elsevier, has developed its own calculation methods, which some believe are superior to those of the JCR.
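A minimal sketch of the rescaling described above, assuming the JCR convention that Eigenfactor scores sum to 100 across the whole network (so the average journal's score is 100/n; `normalized_eigenfactor` is a hypothetical helper, not vendor code):

```python
def normalized_eigenfactor(ef_score, n_journals):
    """Rescale an Eigenfactor score so the network average equals 1.

    If Eigenfactor scores sum to 100 over n journals, the average
    journal scores 100/n; dividing by that average centers the
    metric around 1, as the NE Score is described.
    """
    average_ef = 100.0 / n_journals
    return ef_score / average_ef

# A journal with EF = 0.02 in a network of 10,000 journals has twice
# the average score (0.01), hence an NE of 2: "twice as influential
# as the average journal in the network."
print(normalized_eigenfactor(0.02, 10_000))  # 2.0
```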

Thomson Reuters has made several improvements to its InCites interface, which debuted last year and has been the target of complaints from both new and regular users. The JCR visualization, an interactive widget that seems to serve little function other than to titillate the bored and bewilder the perplexed, is now hidden by default. The interface now permits users to download the data behind many of its tables and charts, and the company is also in the process of expanding its advisory group to help with future usability issues.

In sum, this year’s release delivers some practical and useful improvements, along with a growing willingness on Thomson Reuters’ part to listen to feedback from its stakeholders.

* JIF Percentile is calculated as (n − r + 0.5)/n, where n = the number of journals in the category and r = the journal’s descending rank within that category.

About Phil Davis

I am an independent researcher and publishing consultant specializing in the statistical analysis of citation, readership and survey data. I am a former postdoctoral researcher in science communication and former science librarian.


24 thoughts on “2014 Journal Impact Factors”

  1. Hi Phil, please note that the 2014 journal metrics (SNIP, SJR and IPP) for all titles indexed in Scopus are available to download at: The values for more than 30 thousand titles going back to 1999 are all available for free. For more information, see our blog post:

    Posted by Wim Meester | Jun 18, 2015, 6:01 am
  2. Phil, your list of suppressed journals (and total of 39) currently links to the 2013 JCR. I realise that page will be updated later today, but the total of 39 refers to last year and will likely change. (Unless you know that it won’t!)

    Posted by Martyn Lawrence | Jun 18, 2015, 6:17 am
    • I was provided an embargoed list of suppressions for this year (2014): 39 total suppressions; 29 for self-citation and 10 for “citation stacking”. I am told the list will change sometime today.

      Posted by Phil Davis | Jun 18, 2015, 6:20 am
  3. One question I am often asked is what the average increase is for the entire index. For example, if I look at all of my IFs and say that they average out to a 20% increase over the previous year, I will be asked whether, on average, the entire index represents a 20% increase and we are just riding the wave.

    Phil, oh wise one with stats and IF, do you have a sense of that?

    Posted by Angela Cochran | Jun 18, 2015, 9:08 am
  4. Thanks for this chock full post on 2014 IFs and related stats.

    The fact that 53% of journals received an increase in their IF is interesting. Is that typical of most years? I’ve heard the opinion that there is natural “grade inflation” going on in IFs (because lists of citations are getting longer in newer articles?), but I haven’t seen anyone document that, or any other case made for grade inflation.

    [I read the JIF Percentile paragraph while eating peanut butter for breakfast. So I’ve come up with a formula for comparing crunchiness vs. creaminess…]

    John Sack, HighWire Press

    Posted by John Sack | Jun 18, 2015, 9:17 am
  5. John, I’ve heard the same thing about IF inflation, but I thought it existed because of the addition of new journal titles into JCR every year; more of what’s already being cited is then counted toward impact factors when the journals being cited are added to the JCR? A better measure, of course, is the ranking.

    Posted by Suzanne Kettley, Canadian Science Publishing | Jun 18, 2015, 11:34 am
  6. I’m wondering why our publication (Physics Today) has no metrics. Could it be that we’re classed as a trade journal despite having peer-reviewed articles and DOIs deposited?

    Anyway, I’ve asked for clarification.

    Posted by starbird2005 | Jun 18, 2015, 2:02 pm
  7. Hi, could you tell me where you got this information?
    I could not find relevant information of 2014 Journal Impact Factors online.
    Thank you!

    Posted by Xin Xu | Jun 18, 2015, 4:30 pm
  8. My daily newspaper, the Ottawa Citizen, got a company in India to offer us an impact factor of 4.3 for $40. Wonder if we’ll show up in the 2014 JCR list. Our story ran here:

    Posted by Tom Spears, Ottawa Citizen | Jun 19, 2015, 1:35 pm
    • I have to wonder why anyone would pay a fake company to get a fake Impact Factor. If your journal is so bogus that you’d need to do this, why not just fake the Impact Factor and save the money?

      Posted by David Crotty | Jun 19, 2015, 4:26 pm
  9. Where can I download the 2013 and 2014 JIF lists/JCR reports in either Excel or CSV format? I have done my best to look at the Thomson Reuters and Web of Knowledge websites but can only find information about the JCR and what it is, not a downloadable list. I can also find downloadable lists on other websites but am not confident that those are authoritative.

    Posted by Juniper May | Jun 24, 2015, 9:57 pm


  1. Pingback: A Look Inside the 2014 Journal Citation Reports: Journal Impact Factor | LJ INFOdocket - Jun 18, 2015

  2. Pingback: 2014 Journal Impact Factors | belennovoagarcia - Jun 19, 2015

  3. Pingback: Journal Citation Report (JCR) 2014 | ∫InCEC Blog - Jun 19, 2015

  4. Pingback: ISI: Reviste românești eliminate | Isarlâk - Jun 19, 2015

  5. Pingback: Impact factor: pubblicata l’edizione 2014 | PuntoMedLibrary - Jun 19, 2015

  6. Pingback: New 2014 Journal Impact Factors – Stephen's Lighthouse - Jun 19, 2015

  7. Pingback: Weekend reads: Duplication rampant in cancer research?; meet the data detective; journals behaving badly - Retraction Watch at Retraction Watch - Jun 20, 2015

  8. Pingback: Finding Impact Factors: Journal Citation Reports latest issue | RESEARCH NEWS from Swansea University Library - Jun 22, 2015

  9. Pingback: Qui sont les raisonnables ? | Dernières nouvelles du front - Jun 28, 2015

  10. Pingback: About OA19, Open Access, Copyright and Open Science – This Caught Our attention | Stockholm University Press Blog - Jul 8, 2015

The mission of the Society for Scholarly Publishing (SSP) is "[t]o advance scholarly publishing and communication, and the professional development of its members through education, collaboration, and networking." SSP established The Scholarly Kitchen blog in February 2008 to keep SSP members and interested parties aware of new developments in publishing.
The Scholarly Kitchen is a moderated and independent blog. Opinions on The Scholarly Kitchen are those of the authors. They are not necessarily those held by the Society for Scholarly Publishing nor by their respective employers.