Note: Clarivate provided responses to my questions that were unfortunately held up in transit and captured as spam. Please see their responses at the end of this post.

Clarivate Analytics announced today that they are granting all journals in the Web of Science Core Collection an Impact Factor with the 2023 release.

The Web of Science Core Collection has over 21,000 journals and includes:

  • Science Citation Index Expanded (SCIE)*
  • Social Sciences Citation Index (SSCI)*
  • Arts and Humanities Citation Index (AHCI)
  • Emerging Sources Citation Index (ESCI)

Those journals in the indices marked with an asterisk currently get an Impact Factor. Starting with the 2023 release (the 2022 Impact Factors), journals in the Arts and Humanities Citation Index and the Emerging Sources Citation Index will also get Impact Factors. Journals with an Impact Factor are included in the Journal Citation Reports (JCR).

[Image: Old relief artwork depicting souls being purified in purgatory]

About the Emerging Sources Citation Index (ESCI)

In 2015, Clarivate launched the ESCI. It was initially described as an index of journals that are up-and-coming — meaning new journals, or established journals in niche areas that are growing in impact. At the time of launch, publishers were told that a journal selected for ESCI would likely get an Impact Factor within a few years.

The model for ESCI seemed to shift a few years later, and many journals that have been in ESCI since 2015 still don’t have Impact Factors. In fact, Clarivate includes content for indexed journals back to 2005, so there were clearly journals more than ten years old in the database when it launched.

Clarivate reports that ESCI has over 7,800 journals with 3 million records. A little over a third of those records are open access.

The inclusion criteria for all four indices comprise 24 quality measures and four “impact” measures. Journals that meet all 28 criteria are included in SCIE, SSCI, or AHCI. Those that meet only the 24 quality measures are relegated to the ESCI.

This change announced today indicates that the four impact measures are no longer required in order to get an Impact Factor.

About the Four “Impact” Measures

The 24 quality measures for inclusion in the Web of Science Core Collection include everything you would expect — an ISSN, a peer-review policy, timeliness of publication, author and editorial board affiliations, an ethics statement, and consistency of the content and citations with the stated aims and scope of the journal.

Again, any journal that passes all 24 of these criteria is likely to be included in the Web of Science Core Collection. I say “likely” because there is an editorial board of actual humans that makes the decision.

The four impact measures are:

  • Comparative Citation Analysis: Number and sources of citations — are good journals citing the work in the candidate journal, and is this sustainable over some period of time?
  • Author Citation Analysis: Most authors published in the candidate journal should have a “discernable” publication history in other indexed journals.
  • Editorial Board Citation Analysis: Most editorial board members listed in the candidate journal should have a “discernable” publication history in other indexed journals.
  • Content Significance: Content should be “of interest, importance, and value to its intended readership and Web of Science subscribers.”

Candidate journals that meet the 24 quality measures but do not meet the four impact measures are likely to be in the ESCI only.
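To make that placement logic concrete, here is a minimal sketch of the decision tree as I understand it. This is my own illustration, not Clarivate’s process or code, and the human editorial-board judgment is reduced to a single flag.

```python
# Simplified sketch of the index-placement logic described above.
# My own illustration of my understanding, not Clarivate's actual process.

def place_journal(quality_criteria_passed: int,
                  impact_criteria_passed: int,
                  editors_approve: bool = True) -> str:
    """Return where a candidate journal would likely land."""
    if quality_criteria_passed < 24 or not editors_approve:
        return "not indexed in the Core Collection"
    if impact_criteria_passed == 4:
        return "SCIE, SSCI, or AHCI (discipline-specific index)"
    return "ESCI only"

print(place_journal(24, 4))  # discipline-specific index
print(place_journal(24, 2))  # ESCI only
print(place_journal(23, 4))  # not indexed
```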

What Benefit Does the ESCI Bring?

Clarivate’s Web of Science is in hot competition with Elsevier’s Scopus database. Scopus has traditionally taken a more inclusive approach to admitting journals, giving it the advantage of having more content than Web of Science. Launching the ESCI allowed Web of Science to lower its inclusion standards, admit thousands of journals and their backfiles, and still bestow Impact Factors only on the journals that meet a higher standard.

Web of Science’s Core Collection and Elsevier’s Scopus are largely duplicative of each other. New “baskets of metrics” have been added to both as they compete for library subscriptions.

Journals that do not have an Impact Factor face major uphill battles. Authors care very much about Impact Factor, despite some groups trying for decades to consign dependence on the Impact Factor to the trash bin. New journals that lack the word “Nature” or “Lancet” in their titles may have to wait years (three years would be the minimum — some exceptions apply; see note below) if not decades for an Impact Factor.

Note: some brand new journals are granted partial Impact Factors because they are deemed important to the field based on title and editor affiliations alone.

Having a journal accepted into the Core Collection, even if just in the ESCI, means that the content of the journal is discoverable in the Web of Science database. It also shows that you have met the quality criteria. And mostly, it signals to potential authors that you are likely to get an Impact Factor at some point. So to this end, the ESCI does benefit included journals. That said, there is the purgatory problem.

Journals in Purgatory

Journals can sit in the ESCI for a very long time, or forever even. This presents a few problems.

You can’t be “emerging” forever. You either emerge or you don’t. Calling something an “Emerging Source” indicates that journals will get promoted at some point in the not-so-far-off future. However, I know of several journals that have been in that index since 2015 or 2016. I don’t mean journals that launched in 2015, but older journals included in the database within the first year of ESCI.

To remain true to the name, those journals labeled “emerging” that have not been deemed “emergent” in the last seven years need to be either kicked out or promoted. It’s like having someone as an intern for a decade — it no longer looks like an honor.

According to the Clarivate site, a journal in the ESCI whose estimated Impact Factor would put it in the top half of the subject category (Quartiles 1 or 2) will likely get an Impact Factor. Again, I know of journals that hit this mark and are still in ESCI.

Clarivate is not going to want to reduce the collection by eliminating journals they already decided met the 24 quality markers. And based on what they claim is in the ESCI, they would potentially be eliminating a fair number of open access journals, niche regional journals, and journals from the “Global South”. In essence, it would mean punishing journals because their authors and editors aren’t established enough.

Clarivate had two options — arbitrarily set a time limit for journals in the ESCI or give them all Impact Factors.

Unanswered Questions about the ESCI

My first question is whether the ESCI is being eliminated. It’s possible that Clarivate will keep the ESCI and just put Impact Factors next to each title it includes. This has the potential to become a public relations nightmare. (See below.)

Journals care about two things regarding the Impact Factor — the number and the rank. As the last several years have shown, Impact Factors overall can have wild swings due to changes in the corpus, changes to what is counted and when, and huge external shocks, such as a global pandemic. If every journal in a category goes up, or down, rank is the next way to differentiate yourself from the competition.

Journals that have an Impact Factor but sit on a separate list will, for promotional purposes, sort themselves into the ranking lists anyway. No one will know what is going on. Confusion has already ensued with the Journal Citation Indicator (JCI), a metric rolled out last year. The JCI field-normalizes the citation impact of journals and provides an indicator number for all journals in the Core Collection, even those that don’t have Impact Factors.
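For readers who have not dug into it, here is a rough sketch of what the JCI measures, based on Clarivate’s public description (the exact production formula is theirs, not mine): it is essentially the average category-normalized citation impact of a journal’s citable items over a recent three-year window.

```latex
% Rough sketch of the JCI, based on Clarivate's public description;
% not the exact production formula.
\mathrm{JCI} \;\approx\; \frac{1}{N}\sum_{i=1}^{N} \mathrm{CNCI}_i ,
\qquad
\mathrm{CNCI}_i \;=\; \frac{c_i}{e_i}
```

Here N is the number of citable items (articles and reviews) the journal published over the prior three years, c_i is the number of citations item i has received, and e_i is the expected citation count for items of the same category, document type, and publication year. A JCI of 1.0 means citation performance at the average for the category.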

The introduction of the JCI was met with some skepticism. The Impact Factor has long been criticized because of the lack of transparency. Clarivate has offered new tools over the last several years to take the mystery out of the Impact Factor.

The JCI seems to go in the opposite direction. In fact, Clarivate described it this way: “Although the calculations behind the Journal Citation Indicator are complex, requiring considerable computing power, the end result is simple: a single value that is easy to interpret and compare, complementing current journal metrics and further supporting responsible use.”

Perhaps giving all journals in the Core Collection an Impact Factor will ease the confusion about why some journals have one metric and other journals have another metric. Presumably now all journals in the Core Collection will have an Impact Factor and a JCI.

It’s also possible that ESCI will remain, but with far fewer journals. Perhaps those journals that have been around for less than three years or don’t garner enough citations (a threshold perhaps) will stay in the ESCI while the others go to their respective indices.

In theory, the change announced today will speed up getting an Impact Factor for most journals. What will that average wait time be? Will journals that meet the 24 quality markers automatically get an Impact Factor in their third year?

Clarivate currently moves journals from SCIE and SSCI to the ESCI if they no longer meet the impact measures. If ESCI is eliminated, will those journals just stay in the Journal Citation Reports? Or will we see journals moving between indices as their impact changes?

I have asked these questions but did not get a response in time for posting today. (See below for answers)

What About the Decimals?

The second big announcement today is that with the 2023 release, Clarivate will “display” Impact Factors with only one decimal place instead of three!

Clarivate states that they want to “encourage users to consider the other indicators and descriptive data in the JCR when comparing journals”. It should be noted that not everyone who cares about a journal’s Impact Factor has access to the JCR. Those without access who have been relying solely on the multi-digit Impact Factors and ranks will lose some of their ability to independently assess journals.

Presenting numbers to the thousandths has been ridiculed for a long time. The rationale given previously is that the precision is necessary to determine rank. This is undoubtedly true. In the middle of any category you often see journals that are merely a hundredth or a thousandth off from their nearest competitor. It’s a virtual tie — well, now an actual tie!

Reducing to one decimal place devalues rank and shifts the focus to the actual number. This may be a harder story for a journal to tell.

The bigger question is how many journals, particularly those in the ESCI, will have an Impact Factor of 0.0. I don’t know what the lowest Impact Factor is; I saw an undated list online that claimed 0.017 was the lowest. A small journal with 20 citable items in the two-year window needs only one citation to get an Impact Factor of 0.05, which rounds up to 0.1. But with 23 citable items, that single citation drops it into 0.0 land.
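As a back-of-the-envelope illustration of that arithmetic, here is a minimal sketch — not Clarivate’s code — of the two-year calculation and the new one-decimal display; the “<0.1” cutoff follows the threshold Clarivate describes in the update at the end of this post.

```python
# Minimal sketch of the two-year JIF arithmetic and the one-decimal display.
# Not Clarivate's code; the "<0.1" cutoff follows their answer in the update below.

def impact_factor(citations: int, citable_items: int) -> float:
    """Classic two-year JIF: citations in year Y to items published in Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations / citable_items

def displayed_jif(jif: float) -> str:
    """One-decimal display; values of 0.049 or lower are shown as '<0.1'."""
    return "<0.1" if jif <= 0.049 else f"{jif:.1f}"

# One citation against 20 citable items squeaks over the line...
print(displayed_jif(impact_factor(1, 20)))  # 0.1   (1/20 = 0.05)
# ...but against 23 citable items it falls below the display threshold.
print(displayed_jif(impact_factor(1, 23)))  # <0.1  (1/23 ≈ 0.043)
```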

What happens to those journals with a 0.0? Are they kicked off the list or will they actually show as having no, or virtually no, citations?

Arts and Humanities

I am not purposefully ignoring the fact that journals in the Arts and Humanities Citation Index have never received Impact Factors and now will get them. I just don’t personally know what this means for those journals. Do they want Impact Factors?

Most anecdotes I can find online indicate that some in the humanities don’t find the two-year citation window of the Impact Factor helpful. Papers are much longer, peer review takes more time, and the peak citation window is nowhere near the two-year Impact Factor period.

I would love to learn more about whether AHCI journals think getting Impact Factors is a benefit or a curse.

Conclusion

It is clear that Impact Factors are here to stay, and also that Clarivate does seem to be reacting to the criticism of Impact Factors, oddly, by giving out more of them. Having more journals with Impact Factors should not negatively affect journals that already have them, as citations from ESCI journals have counted toward Impact Factors since 2015.

The larger impact is what this means for assessing journals. We need to have clarity on what will happen to ESCI and whether the JCR inclusion criteria are changing. In the meantime, journals in purgatory can celebrate as they await their forthcoming Impact Factors.

Update

Dr. Nandita Quaderi, Editor-in-Chief and Editorial Vice President of the Web of Science, provided the following answers to my questions. Unfortunately, the response got held up in transit and was not received in time to be included in the original post.

Q: Is the ESCI being deprecated?

A: No, there are no plans to make product-level changes as a result of expanding JIF eligibility – ESCI will remain, as will the three discipline-specific indexes. Expanding the JIF to ESCI journals will help them demonstrate their value to the research community. This decision is aligned to our position that publications in all quality journals, not just highly cited journals, should be eligible for inclusion in research assessment exercises.

Q: If all journals get an Impact Factor, why separate them into the different databases that aren’t dependent on subject matter?

A: The difference between multidisciplinary ESCI and the three discipline-specific indexes is that to enter the discipline-specific indexes, journals must pass both the 24 quality criteria and the additional four impact criteria. This is not changing as a result of JIF expansion.

Q: If it is not being eliminated, will the journals in ESCI appear in the appropriate subject rankings?

A: There are no plans to display ‘combined’ category rankings that include flagship (SCIE, SSCI, AHCI) journals and ESCI journals. In next year’s JCR release, each category will have a ranking for flagship journals and a separate ranking for ESCI journals.

Q: Currently, there are criteria for inclusion in the ESCI and different criteria for inclusion in SCIE, SSCI, or AHCI. It is my understanding that all journals must meet the 24 quality criteria, but that there are four additional impact criteria required for getting an Impact Factor. Is this new policy just eliminating the four impact criteria?

A: The 24 quality criteria and the four impact criteria will remain. All journals, including those indexed in ESCI, need to pass the 24 quality criteria that are designed to select for editorial rigor and best practice at the journal level. Journals indexed in SCIE, SSCI and AHCI need to pass the four impact criteria designed to select the most impactful journals in their respective fields using citation activity as the primary indicator of impact. Details of our evaluation process can be found here: https://clarivate.com/products/scientific-and-academic-research/research-discovery-and-workflow-solutions/web-of-science/core-collection/editorial-selection-process/editorial-selection-process/

Q: The release states that over 9,000 journals will get Impact Factors. Can you provide a breakdown of how many are from ESCI and how many from AHCI?

A: We estimate that over 1,300 AHCI journals will get a JIF and over 7,600 ESCI journals will get a JIF. These numbers will be approximate until the time of JCR extraction next year. Approximately 450 AHCI journals already have a JIF as they are also covered in SCIE and/or SSCI.

Q: Are there other sources of journals in the Core Collection that currently don’t get an Impact Factor?

A: No. There are four journal indexes in the Core Collection. Journals in SCIE and SSCI already get a JIF; from next year, AHCI and ESCI journals will also get a JIF.

Q: Current Clarivate policy states that a journal with an Impact Factor that slips in citations could be “demoted” to the ESCI database. Will there be new rules about journals that “lose impact”? Could they lose their Impact Factor?

Asked slightly differently: Once you have an Impact Factor, and as long as you meet the 24 quality criteria, would you ever lose an Impact Factor?

A: All journals that have passed the rigorous Web of Science quality criteria will receive a JIF. Journals will not lose their JIF if they move from flagship to ESCI. However, JIF suppression analysis will be extended to ESCI and AHCI journals. 

Q: Regarding the three decimal places, what impact will this have on Rank? I would expect there to be a whole lot of ties.

A: Yes, there will be a lot more ties, particularly in the lower quartiles. We see this as a positive development that will encourage users to consider the other indicators and descriptive data (for example, the Journal Citation Indicator [JCI]) in the JCR when comparing journals.

Q: Are there journals currently with Impact Factors so low that they won’t register for an IF anymore (like 0.009)?

A: If the calculated JIF is 0.049 or lower, the value <0.1 will be displayed.


Angela Cochran

Angela Cochran is Vice President of Publishing at the American Society of Clinical Oncology. She is past president of the Society for Scholarly Publishing and of the Council of Science Editors. Views on TSK are her own.

Discussion

26 Thoughts on "The End of Journal Impact Factor Purgatory (and Numbers to the Thousandths)"

Angela, thank you for your very thoughtful and thorough review of Clarivate’s new policy on Impact Factors. What will this mean for publishers who are launching new services that measure the impact of institutions’ research? Will these new factors bring more clarity or confusion to the library community? Ultimately, will these new factors help the industry gauge the importance of one journal over another?

I guess time will tell.

Hi Angela, I am very disappointed to see this as we provided full answers to each of your ten questions yesterday, within three hours – did you not receive them?

My apologies, Lisa. I don’t know why they got swept up in quarantine, but I should have checked. I have updated the post with answers from Dr. Quaderi.

In my reading, these decisions (save the one-decimal change) will promote the value of the JIF and ESCI. The alternative Journal Citation Indicator (JCI) metric never really caught on and further isolated journals in the ESCI. Further, I don’t believe that the ESCI had much of a market share anyway. Over the years, several publishers confided in me that Clarivate representatives told them that purchasing a subscription to the ESCI may help get their journals out of “purgatory” and into the Core Collection. These publishers often balked at the suggestion and interpreted it as a form of pay-to-play.

The one-decimal reporting will create a lot of ties, but those journals that promote their ranking are ranked very highly (“ranked first in the field of X!”). 79th out of 150 makes less of a marketing pitch, whether you are tied or not.

Thanks, Angela, helpful blog. I was a little confused by the following line in the Clarivate blog: “This decision [expanding the JIF to all journals] is aligned to our position that publications in all quality journals, not just highly cited journals, should be eligible for inclusion in research assessment exercises.” – https://clarivate.com/blog/clarivate-announces-changes-to-the-2023-journal-citation-reports-release/
I would think research assessment should be done using articles put forward by the researchers being assessed. These could have many citations, altmetrics, or downloads, or any other reason why a researcher deems the article to be relevant for their assessment. JIFs do have no role at all to play in research assessment; au contraire, they may well do more harm now that there are even more JIFs to go around…

“JIFs do have no role at all to play in research assessment”

If only that were true. It is not – many researchers do not care enough to take the time to look into those things and only search for journals with high IFs or citation counts.

I suspect that Clarivate is setting things up for a lot of confusion going forward. For most users, there will seem to be no clear distinction between the ESCI and the JCR — if the criteria remain roughly the same, and only journals that have a JIF in the top two quartiles of their category are able to move out of ESCI and into the JCR, then the whole “content significance” criterion goes out the window. You’ll have a ton of journals (the bottom quartile and much of the third quartile) in the JCR that are cited less and have lower JIFs than many of the journals relegated to the minor leagues of ESCI. Those less-cited journals will remain in the JCR solely because they happened to enter it before ESCI existed and the criteria changed.

Unless Clarivate is going to do a large scale clean-out of the JCR, and shift huge swathes of low-performing journals to ESCI, the whole thing will be an indecipherable mess.

I had not thought of that. Clarivate’s site does make it clear that they can move journals from the JCR to the ESCI. I don’t know if that has ever happened, but it seems a moot point if they all get Impact Factors. And whether a journal moves to the JCR depends on some fairly subjective measures, so niche journals will still have trouble moving up. Though whether it matters if they all have IFs anyway remains to be seen.

I suspect most people will just look at the numbers and ignore whether the journal is in ESCI or JCR.

This sounds similar to what the National Library of Medicine did with PubMed. Today, I don’t think anyone discerns whether a journal is indexed in Medline or PubMed Central. Most publishers just care whether their journal is discoverable on the PubMed platform.

ESCI and AHCI journals are already in the JCR from last year. Wouldn’t that mean they are not going to clean *out* JCR, just add more stuff to those journals *in* JCR?

Sorry, I’m mixing up my terms (yet another source of confusion for WoS users). The JCR has previously only included SCIE and SSCI, and now it will include JIFs for both SCIE/SSCI and ESCI, but separately. What I’m talking about is that they should probably remove swathes of journals from SCIE/SSCI that fail to meet the criteria they’re putting on ESCI journals, as many (which were put in SCIE/SSCI before ESCI existed) probably don’t meet the “significance” levels they’re holding ESCI journals to. Hence you’ll have journals with much stronger citation performance in ESCI than many in SCIE/SSCI, and the deciding factor for where it goes will be the journal’s age and when it was first indexed, rather than its importance.

As I understood it from Clarivate at the time, the launch of ESCI was not just about emerging journals but also about content from ‘emerging countries’ which may be of global significance, as well as those titles covering emerging subject-areas. T&F lobbied on behalf of our regional journals, especially those edited and published from the ‘Global South’, and we were pleased with the outcome at the time. This new development should probably also be welcomed on EDI grounds, so long as some of the ESCI inclusion criteria remain and expansion continues.

This is good news for all the journals in the ESCI — all of which have to meet the 24 quality measures. I think where ESCI went wrong is that some journals met the 24 measures but aren’t meeting the 4 impact measures. I really don’t think Clarivate intended for journals to be in the ESCI forever, but that is what is happening. Now journals won’t care if they are only in ESCI as long as they have an Impact Factor.

But the question is: were the journals that remained in the ESCI for several years ever re-evaluated? Did they remain there because they were not meeting the 4 impact measures, or simply because nobody took care of evaluating them? Clarivate never answers these kinds of questions, and that makes journals that stay in ESCI for several years look bad regardless of whether or not they are actually underperforming.

Great post, Angela. Thank you for shining a light on one of the more mysterious corners of scholarly publishing in a way that us non-editorial types can make sense of.

Thanks, Angela, for the detailed info! I received a quick heads-up call from Clarivate the evening before about this but couldn’t raise many of the questions that you had. One of the questions that I had was: will there be cases where the JIF of a journal in ESCI is higher than the JIF of other journals in SCIE in the same category (which I think is very likely)? Will that create even more confusion? Anyway, I still think that it is a good move to tone down the importance of using just one number to qualify the quality of a journal.

Interesting; thanks, Angela, for this thoughtful post. Another point to consider, along with the WoS competition from Scopus, is the emergence of Digital Science’s Dimensions database, which also tracks a broader set of publications and makes the links between grants/funders, policy documents, patents, preprints, data sources, and clinical trials. There are a few articles that have analyzed all these different databases (not that they are apples-to-apples comparisons, of course), but here are a couple of links for those interested: https://link.springer.com/article/10.1007/s11192-022-04309-6 https://www.frontiersin.org/articles/10.3389/frma.2020.593494/full

ESCI seems to be moot as a product. In a nutshell, Clarivate is responding to Elsevier’s Scopus and its CiteScore. As a freely accessible tool, CiteScore metrics deliver insight into research impact and influence. And Elsevier makes it very easy for journals to badge their sites with a widget or an image. With all this in mind, I wonder if the future of the JCR’s JIF will allow for more (free) accessibility, in order to keep it in the audience’s ‘hearts and minds’?

Thanks for the post, Angela. Well done.

Dropping two decimal places is a major achievement and in fact quite brave of Clarivate. I remember when the DORA committee (https://sfdora.org/) put this to the then owners of the Web of Science/Science Citation Index, Thomson Reuters, as one of the more important enhancement suggestions more than 7 years ago, the answer was very clear: we know the second and third decimal places make little sense statistically, but they are necessary for a clear rank order, and this is what our ‘service’ is based on.
Collapsing the journal rankings into ties (and ideally bins) is much more than just a cosmetic change and will ensure that algorithmic research assessment is further confounded. (Not that these rankings don’t have other serious issues, such as opaque inclusion criteria and conflating review and research journals, which the JIF also still does.)

First of all, the Impact Factor for journals has become a numbers game, indirectly controlled by authors and their choices about where to publish. It has become contagious. For example, if a string of authors publish in a low Impact Factor journal, a few years down the lane you can see the journal rise in its Impact Factor. What the journal could not do on its own has been done by a string of authors. On the other side, authors refraining from publishing their papers in an already established journal may kill the Impact Factor of that established journal. It has become a dynamic stock exchange index rather than a scientific index.

My opinion would be to replace the metrics with something more meaningful in terms of the utility of the papers published in journals. I have a simple method to assess this. Why go so complex, introducing metric after metric and driving an unnecessary competition among journals, among authors, and across the research fraternity? It is more a business game than a scientific game. What goals are Clarivate or Scopus actually caring for? Can’t there be logical and true metrics that are synergistic? Today’s science is application-driven, demanding that problems be solved at the earliest for the benefit of mankind and the ecosystem. So let’s organize the metrics in line with public and societal impact, be it for the journal or the authors.

Why is there a need to publish tons and tons of research papers if only some thousands are really contributing? In many countries, publishing has become a prestige issue, and a source of survival too, rather than a contribution to R&D. Just think why so many journals have emerged over the years. How many of them are really contributing to solving social and societal problems? On the other extreme, we have patents to add to the confusion. The increase in numbers across the board has led to chaos, and now we are in deep trouble telling apart the good, the better, the best, and the worst. It is my request to the great leaders to arrive at a metric that is simple, logical, motivating, strict, socially relevant, and so on, at the earliest possible. Metrics need to be evolved for both journals and individual authors. The current metrics are a beautifully designed, complex number game that should fade away with time. All of the above are my personal thoughts; take them positively, please.

No matter what Clarivate’s Web of Science (or Elsevier’s Scopus, or Digital Science’s Dimensions and Altmetric) does with regard to journal scores and research assessment, there will always be questions, complaints, and maybe even conspiracy theories. But at the end of the day, trying new things and evolving are mandatory for products and industries as a whole in the modern, everything-is-tech world. So kudos to Clarivate for moving forward. Besides, this type of news gives blogs like TSK something new to write about!
