Earlier this year, a U.S. District Court ruled that publisher and conference organizer Srinubabu Gedela and his companies OMICS, iMedPub, and Conference Series violated the U.S. FTC Act “by making deceptive claims regarding their academic journals and scientific conferences, and by failing to adequately disclose their publishing fees.” The Court imposed a number of requirements as well as a judgment of $50.1 million. 

Today I’d like to reflect on the implications that this ruling may have for institutions — those that employ researchers and those that fund researchers, especially as this will by no means be the last enforcement action taken against publishers accused of deceptive practices.

How might an institution repair a tarnished reputation? And, given the reality of fraudulent publishers and their deceptive practices, will institutions consider more strongly guiding author choice of publishing venue in order to protect institutional reputation? 

Image: Mending the Tears, Winslow Homer, The Metropolitan Museum of Art

Institutional Interests

Universities and funders often herald the achievements of their researchers in order to garner positive press coverage, bolster their reputations, or recruit new employees. University rankings and memberships often depend heavily not only on measures of research activity and quality but also on the impression institutions generate — the brand identity, if you will — of that quality.

Human subjects mistreatment, research misconduct, athletics scandals, fraud and other misuses of government funding, fundraising outrages, admissions corruption, and the like, by even a small number of individuals, regularly do damage to an institution’s reputation. These reputational missteps are damaging in a variety of ways, from fundraising to admissions applications, and they thereby harm the careers of these institutions’ faculty members and students. 

Across institutions, a great deal of effort is put into a variety of risk management mechanisms to ensure compliance with legal, financial, and ethical frameworks and to encourage good research practices. This work is sometimes tagged as bureaucracy or, worse yet, bloat. Even so, every month the Chronicle of Higher Education brings us more stories about failures to protect universities from the worst instincts of their stakeholders.

Can Institutions Guide Choice of Publishing Venue?

Given the connection between researcher activity and institutional reputation, it is somewhat surprising that institutions have not moved more aggressively to guide author choice in publication venue. One can imagine that administrators might choose to tread lightly in this realm, given the importance of intellectual and academic freedom. However, it is also the case that review, tenure, and promotion guidelines, as well as funder mandates, do already shape researcher choice, whether for journals that have good reputations or for those that offer required open access and copyright licensing terms. 

So, it is not impossible to envision mechanisms for shaping researcher choice of publication venue; they already exist. These shaping forces are typically framed as being in pursuit of something positive — e.g., a venue of perceived quality or a pathway to open access. The question, then, is: should institutions also employ shaping forces to prevent something negative — i.e., publication in a venue that engages in deceptive practices?

Are Institutions Paying Attention?

Over the past few years, there have been a number of analyses that look at the extent of publishing in predatory journals in particular fields and disciplines. Typically drawing on Beall’s List or its progeny, which are admittedly problematic in a variety of ways, these investigations generally conclude with calls to better support authors in deciding where to publish, through training and checklist approaches.

We have also seen some examinations of the publications of a specific group of faculty. When these have been shared publicly (one can reasonably suspect that there are more investigations than those publicly disseminated), it seems there has been a “shoot the messenger” response; at least one faculty member was banned from his own campus after publishing a report on the topic.

This may in part explain why the OMICS ruling does not seem to have garnered the kind of concern or analysis — at least in any public way — that I had thought it might. When I didn’t see much immediate discussion after the ruling, I put out an inquiry on a few listservs and Twitter:

Have any librarians been contacted by your admin wanting to know if your faculty published in OMICS journals after the FTC judgment? If don’t want to reply publicly, email lisalibrarian@gmail.com? Will keep confidential

The responses were not hard to keep confidential as the only replies I received were librarians saying that they had not been contacted. 

There are many possibilities here. Of course it could be the case that these investigations went forward internally without the involvement of the librarians. But, it is also possible that research administrators are unaware of the ruling. Or, that they are aware of it but assume that, while authors at other institutions might publish with OMICS, their own would not.

It seems that funders might also want to guide their researchers away from publishing in fraudulent venues, but here too the approach seems to have been education and advice. A staff member at one funder did ponder whether, after the OMICS ruling, there was an ethical obligation to alert grantees who had published in OMICS journals:

I would be curious to know this as well. I know a few grantees have published in OMICS journals in the past and am unsure if I should do anything.

This is something I haven’t seen in any discussions about fraudulent publishers. Is there any obligation of care to alert individual authors who might find themselves with a tarnished reputation as a victim of fraud? And, if there is, who has this obligation? 

What Could Institutions Do?

In light of the potential risk to an institution, I worry that the reality of fraudulent publishers might be one of those things where institutional leaders are crossing their fingers and hoping for the best without preparing for the worst. 

At a minimum, it seems that institutions should be monitoring their risk exposure. Dimensions Analytics offers, for example, the ability to search for articles in journals from a specific publisher with a given institutional affiliation or funding source. Institutional repositories, research information management systems, etc. might also be able to assist in this monitoring. 
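As a rough illustration, Crossref’s public REST API can approximate this kind of monitoring for publishers that register DOIs there. The sketch below is illustrative only: the DOI prefix (10.4172, which a commenter below identifies as OMICS’s Crossref prefix) and the institution name are placeholder values, and, as the discussion below makes clear, results are only as good as the affiliation metadata the publisher actually deposited.

```python
# Sketch: list works under a given publisher's DOI prefix whose deposited
# affiliation metadata matches an institution, via Crossref's public REST API.
# The prefix and institution name here are placeholders for illustration.
import requests

CROSSREF_WORKS = "https://api.crossref.org/works"

def works_by_prefix_and_affiliation(prefix, affiliation, rows=20):
    params = {
        "filter": f"prefix:{prefix}",
        "query.affiliation": affiliation,  # fuzzy match on deposited affiliation strings
        "rows": rows,
    }
    resp = requests.get(CROSSREF_WORKS, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]["items"]

for work in works_by_prefix_and_affiliation("10.4172", "University of Illinois"):
    print(work.get("DOI"), (work.get("title") or ["(no title)"])[0])
```

The caveat raised in the comments applies here as well: a publisher that deposits little or no affiliation data, or that loses its Crossref membership, simply disappears from a query like this.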

From an ethos of care perspective, institutions may also wish to reach out to their authors and alert them that their research reputation may be tarnished if they have been the victims of fraudulent publishers. Institutions might also provide support for authors who wish to try and withdraw their publications or seek refunds of any payments they made to a publisher. 

Finally, as discussed above, institutions may wish to put in place more proactive strategies for guiding authors as they decide which journals to submit their manuscripts to. Such guidance could be framed as an optional support service. Funders may be in a position to use their grant guidelines to enforce blacklist/whitelist compliance for grantees. Institutions may also wish to explore options for refusing to process payments to certain publishers, so as to avoid being complicit in fraudulent activity.

Lisa Janicke Hinchliffe

Lisa Janicke Hinchliffe is Professor/Coordinator for Research and Teaching Professional Development in the University Library and affiliate faculty in the School of Information Sciences and Center for Global Studies at the University of Illinois at Urbana-Champaign. lisahinchliffe.com

Discussion

8 Thoughts on "Repairing an Institutional Reputation Tarnished by Fraudulent Publishing"

Actually, it is not really easy to monitor fraudulent publishing with Dimensions Analytics or similar services. OMICS and similar large-scale publishers that are usually tagged as predatory are probably only indexed in Dimensions because they have DOIs registered at Crossref. As soon as they are dropped from Crossref, they will disappear from most of the indices that rely on DOIs. This is the case, for instance, with OMICS in Dimensions: there were more than 11000 OMICS publications from 2018, but there are fewer than 600 from 2019, after OMICS was dropped from Crossref. (https://app.dimensions.ai/discover/publication?search_text=10.4172*&search_type=kws&search_field=doi)

Another problem is that the metadata from these publishers are usually of low quality, which means that matching affiliations and funders may be really difficult or impossible. OMICS has at least 95000 registered DOIs in Crossref with the prefix 10.4172 (https://api.crossref.org/works?filter=prefix:10.4172&rows=0), and only 103 (!) of these have deposited affiliation data in Crossref (https://api.crossref.org/works?filter=prefix:10.4172,has-affiliation:1&rows=0).

There are probably hundreds or even thousands of other questionable journals and publishers that do not even bother to register DOIs, and these are probably not indexed in such databases at all.
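Counts like those above can be scripted rather than read off raw API URLs; here is a minimal Python sketch using the same Crossref filters, with rows=0 so that only the total-results figure is returned:

```python
# Sketch: reproduce the Crossref counts cited above. rows=0 returns only
# summary metadata (including total-results), not the work records themselves.
import requests

def crossref_count(filters):
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"filter": filters, "rows": 0},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["total-results"]

print("All OMICS-prefix DOIs:", crossref_count("prefix:10.4172"))
print("With affiliation data:", crossref_count("prefix:10.4172,has-affiliation:1"))
```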

Thanks for this comment. I definitely don’t think this is necessarily easy to do; the tool set to assist an institution is not robust. It seems that Dimensions must not be relying exclusively on Crossref affiliation data, given the search results I get there, though it is the case that the data is messy. Regardless, the point is well taken that no one source is going to identify everything. At this point, it might involve a more manual process – e.g., take the lists of publications reported by one’s authors (at a university, or those with grants from a particular funder), eliminate those publications in “white listed” journals (e.g., those indexed in WOS, Scopus, and/or disciplinary databases), and then examine the remainder.
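To make that triage concrete, here is a minimal sketch; the publication records, the whitelisted ISSNs, and the flag_for_review helper are all hypothetical, standing in for data an institution would assemble from its own reporting systems and from index title lists such as those linked in the comments below.

```python
# Sketch of the manual triage described above: start from publications reported
# by one's own authors, drop anything appearing in a "white listed" journal
# (e.g., indexed in WOS or Scopus), and keep the remainder for human review.
# All inputs below are hypothetical placeholders.

def flag_for_review(reported_publications, whitelisted_issns):
    """Return publications whose journal ISSNs overlap no trusted list."""
    flagged = []
    for pub in reported_publications:
        issns = set(pub.get("issn", []))
        if not issns & whitelisted_issns:  # no overlap with whitelisted journals
            flagged.append(pub)
    return flagged

# Illustrative inputs only:
reported_publications = [
    {"title": "Example article A", "issn": ["1234-5678"]},
    {"title": "Example article B", "issn": ["9999-0001"]},
]
whitelisted_issns = {"1234-5678"}  # e.g., ISSNs harvested from WOS/Scopus title lists

for pub in flag_for_review(reported_publications, whitelisted_issns):
    print("Needs manual review:", pub["title"])
```

Anything flagged would still need human judgment, since absence from an index is, as noted, a weak signal on its own.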

Indeed, mechanisms for shaping researcher choice of publication already exist. And I don’t think that is a good thing. For a few years now, more and more business schools have developed “approved journals lists” naming the journals professors must publish in to earn promotion and tenure. Invariably, these lists name the top journals in the field and thus prevent authors from publishing in predatory journals. But effectively, what these lists do is marginalize researchers in narrow fields. For example, if only Academy of Management journals are on the approved journals list (which is the case for quite a few schools), supply chain management researchers are competing for just a handful of spots a year in those journals, effectively locking out a lot of good research.

These lists also feed into the idea that a good journal has to have an impact factor. The lists just create a breeding ground for an ultra-competitive environment that leads to petty infighting in a department at best and unethical behavior at worst. Predatory journals are a real problem. But researchers have to be allowed to publish where they feel the best outlet for their research is, not where administrators think the best outlet is, which is often based on something other than the quality of the research itself. Perhaps SSP needs to develop an educational campaign to help researchers find the best publishing venue and identify predatory journals for themselves, instead of relying on someone else to do it for them.

Very interesting. I am aware that in some countries, management journals are ranked and cash awards are given for each paper that gets into a top ranked journal. I suppose that’s not as harsh as saying “here is a list of journals you should get into in order to get a promotion,” but policies like this really limit researchers in niche subdisciplines or interdisciplinary work.

One could perhaps imagine an approach where it is a matter of whether an author is allowed to list their university affiliation rather than whether they may publish in the venue at all?

Think Check Submit already exists. Of course, if someone has been duped, the point is that they didn’t think the journal was fraudulent. That’s what makes it fraudulent — that they are able to convince people they are doing something (e.g., peer review) that they are not. Deceptive practices are a problem … because they deceive. Think Check Submit can’t help with that because, if you go through the protocol, you’ll conclude (because you are deceived) that the journal is fine.

“Typically drawing on Beall’s List or its progeny”

I still wonder about the availability of curated lists. Lisa, does your library subscribe to Cabell’s journal analysis services? If so, is it widely used? Their website says they have over 750 universities subscribing, which sounds like high uptake considering there are about 1000 research universities in the world. Still, I question the broad availability of listings. OMICS may have become infamous, but there are a lot of other publishers writing their flattering invitations day after day.

There are 963 universities ranked in the Leiden ranking:
https://www.leidenranking.com/ranking/2019/list
“These are all universities worldwide that have produced at least 1000 Web of Science indexed publications in the period 2014–2017.”
So probably there are at least a few thousand universities in the world which publish in Web of Science indexed journals, but there must be many thousands of universities altogether:
https://www.quora.com/How-many-universities-are-there-in-the-world-4

There are almost 3000 colleges and universities only in China:
https://en.wikipedia.org/wiki/List_of_universities_in_China

According to Cabell’s homepage they have over 11000 journals listed in their curated whitelist:
https://www2.cabells.com/about-whitelist

Just for comparison:
There are more than 21100 journals indexed in Clarivate’s Web of Science Core Collection:
https://clarivate.com/webofsciencegroup/solutions/web-of-science-core-collection/
There is a freely available list of the indexed journals:
http://mjl.clarivate.com/
Almost 12000 of these have a Journal Impact Factor, according to Clarivate’s Journal Citation Reports homepage:
https://clarivate.com/webofsciencegroup/solutions/journal-citation-reports/

Scopus indexes at least 22000 journals. They have a freely available list at:
https://www.scopus.com/sources
