An example of cultural bias: Density of geotagged wikipedia entries.

I can tell that it’s summer. Edinburgh is plastered in posters for comedy acts, the students have been replaced by tourists and festival-goers, and there is a brief lull in the number of publishing conferences. It’s the silly season, when issues have a tendency to resurface and old arguments flare up. One such incident happened last week and resulted in a bit of a storm on Twitter and in the blogosphere: Jeffrey Beall wrote a commentary on the South American aggregator and citation index, SciELO.

There was quite a bit of anger particularly among open access (OA) advocates in response to the post, much of it expressed on Twitter. Many pointed to Beall’s culturally charged use of the word “favela” to describe SciELO, with some accusing Beall of being classist or derogatory. The word racism was used, albeit quite parenthetically.

Some responses relied on arguments about unfairness and a lack of transparency in Beall’s eponymous list, which isn’t really relevant here. It’s important to remember that Beall did not call SciELO predatory; I’m sure he would concede that SciELO is a legitimate and reputable publisher. Instead, Beall contended that SciELO is a bad place for scholars to put their work. Here’s part of what he wrote:

Many North American scholars have never even heard of these meta-publishers or the journals they aggregate. Their content is largely hidden, the neighborhood remote and unfamiliar.

In other words, since SciELO isn’t as well known in the US and Canada as other publishers are, nobody’s going to read it and it’s a bad place to publish. That’s a pretty tough point of view to defend. To support it, Beall claims that SciELO isn’t indexed in Web of Science or Scopus. He concedes that you could find the work in Google Scholar, but contends that the search engine is “poisoned by fringe science” and so scholars won’t find SciELO’s content that way. The message here is that publishing with nationally run or regional publishers is not in the best interests of local researchers. There are a few levels of wrong going on, but even if you agree with the value judgements being made, Beall’s argument would be flawed simply because the supporting evidence is incorrect.

An article from 2010 by Samaly Santa and Victor Herrero-Solana in Investigación bibliotecológica examined the coverage of Latin American journals (including but not limited to SciELO content) in Web of Science and Scopus. They found growing coverage even then. Since then, Web of Science has partnered with SciELO to create the SciELO Citation Index, which, to his credit, Beall acknowledges. He does, however, question how many libraries would license it. In reality, it’s part of the Core Collection, so coverage is certainly very good. According to a post on the SciELO blog by the Brazilian Forum of Public Health Journal Editors and the Associação Brasileira de Saúde Coletiva, the entire SciELO catalogue is currently indexed in Scopus. Finally, content usage on SciELO is healthy: according to a recent email I received from Director Abel Packer, the platform serves an average of just over a million downloads per day. All told, the discoverability of SciELO content is going from strength to strength, and the platform is becoming increasingly important internationally.

Aside from the factual inaccuracies, there’s a bigger issue here that most haven’t addressed, with the notable exception of the post I mentioned above on SciELO’s own blog. In the second paragraph from the bottom, the authors reference this article by Hebe Vessuri et al. in the journal Current Sociology. To quote from the article’s abstract:

…by relating excellence to quality differently, a research policy that seeks to improve the level of science in Latin America while preserving the possibility of solving problems relevant to the region can be designed.

It goes on to say:

…designing a science policy for Latin America (and for any ‘peripheral’ region of the world) requires paying special attention to the mechanisms underpinning the production, circulation and consumption of scientific journals.

What the authors are saying is that in order to develop the best research policy for Latin America (and, by extension, perhaps other regions), it is important to maintain a locally run publishing industry. Why? For example, when editors-in-chief and publishers are drawn from the local community, it creates a platform to publish and promote work that serves the public good in that specific region. The Leiden Manifesto includes this issue as point three of its ten principles, citing regional aspects of HIV research. It’s easy to imagine other examples, such as the perceived importance of research into yellow fever or Mayan archeology compared to, say, Alzheimer’s disease or obesity.

The need for community-based publishers with local editorial control is not limited to emerging markets. As I wrote in a previous post, many university presses and library publishing operations specialize in supporting niche or locally relevant research. In March, Tanya Samman, Jenny Ryan and Michael Donaldson from Canadian Science Publishing wrote a guest post for my Perspectives blog, in which they stated:

As a nationally-based publisher, CSP’s values reflect those of the Canadian research community and of Canadian society in general. Although the research community is still heavily influenced by journal impact factors when determining where to publish, more people are starting to look beyond the impact factor for value added features in their publishing choices, such as integration of multimedia, social media support, premium author services, etc. Community support and engagement is integral to providing such value added services.

The way in which we think about academic excellence is slowly but surely changing. There has been a lot of talk about alternative metrics, the socio-economic impact of research, data publishing and even changing how authorship works, but almost all of it has been based around the needs of markets like North America, Europe and Australia. As the Leiden Manifesto attests, there is a consensus in the field of informetrics that local excellence should be preserved and encouraged, but so far many publishers and librarians haven’t entered that discussion.

It’s going to be a difficult conversation to navigate, as there are a lot of preconceptions and prejudices to be tackled. There is confusion about how to handle new entrants into the market (for the record, SciELO was founded in 1997) and about what constitutes legitimate contribution as opposed to predation. The discussion surrounding emerging markets needs to mature significantly, and a more careful and nuanced approach is needed. There is a real danger that the current tone of the discussion of predatory publishing could lead to guilt by association for all publishers based in the non-English-speaking world, and that would be not only entirely unfair but damaging to the public good.

Phill Jones

Phill Jones is Director of Publisher Outreach for Digital Science. His job is a complex and nebulous thing with seemingly ever evolving responsibilities. These days, he spends quite a lot of his time working in the Consultancy, drawing on Digital Science's data and expertise to support decision making for publishers, institutions, funders and governments.



56 Thoughts on "Defending Regional Excellence in Research or Why Beall is Wrong About SciELO"

That Beall’s deeply suspect ‘cultural bias’ (to put it politely) has resurfaced yet again is no great surprise. What is a surprise is the relatively low level of criticism he faces from within the community, with the notable exception of commentators such as Karen Coyle in Library Journal.

It’s worth reminding ourselves of some of his previous unpalatable asides. For example:

“None of the journals has any content yet, but they have thrown together some editorial boards, mostly people from India.”

and his ‘best’:

“Is this the future of scholarly publishing, dumbed down and offshore?”

Furthermore, in my experience, if one tries to take him to task by commenting on his blog, he avoids debate by simply not publishing comments he does not agree with.

Why the community continues to engage in any way with Beall is completely beyond me.

I simply can’t agree about a “low level of criticism” of Beall. Beall has a multitude of critics, and he deserves them. I wrote about Beall on the Kitchen a while back in “Parting Company with Jeffrey Beall.” What more would you have us do? Meanwhile, who else besides Beall has sought to identify publishers that seem to exist solely to extract money from authors?

David, Could you please list five publishers (or standalone journals) that you believe should not be included on my list?

Beall, could you please list which publishers were once on your list and have since been removed? I know of at least Hindawi and MedKnow. It’s unclear how many such false positives have had their reputations possibly, probably, or potentially damaged. So much for the gray area.

I have some questions for Beall. I politely request Beall, or anybody else, to answer these doubts and questions. (I am sorry for my incorrect and weak English.)

1. What is the outcome of the appeal process? Is there any public data? I am particularly interested to know the names of successful candidates from developing countries, as many people complain that Beall is harsh on these players. The appeal process was established on 11 March 2013. How many publishers or journals have been removed to date? Beall once said that he had removed only two publishers from his list, and only because those two publishers left the open access model and embraced the subscription model.
2. Why is there no public display of contact details for the “four-member advisory board” on the Appeal page? If any new journal fails to show the contact details of its editors, we call that journal predatory very quickly, yet Beall has failed to provide the contact details of his advisers for the last 2.5 years.
3. How long would it take to remove all publishers (and journals) from Beall’s list? Beall has given us a simple constraint: “Appeals are limited to one every 60 days.” As of today, 860 publishers are on the list, so if they all wanted to appeal, processing the appeals would take 141.4 years (irrespective of the result). As of today, 741 standalone journals are on the list, so processing their appeals would take 121.8 years (irrespective of the result). Cumulatively, it would take 263.2 years to process the appeals. For me, this appeal process is simply an eye-wash.
4. Why is Beall hiding one of his well-known publications about open access? I could not see it here:
5. Is Beall not interested in removing publishers from his list? Initially, the link to the “Appeal” page was prominently placed in the top banner of the blog, and many appeals were posted. Then Beall quietly moved the link to the “Other pages” section, and fewer appeals came in. Then Beall moved the link to the end of a very, very long page, and the frequency of appeals fell further.
6. Dove Press appeared in Beall’s list three months ago without any discussion, then suddenly disappeared on 4 August without any justification. Beall, can you give a specific reason why you removed Dove Press, and why you included it in the first place?
7. Finally, the most important question: what is the fate of the successful candidates who passed the test of Bohannon’s sting operation? I have discussed this issue here. Most probably, the answer is that a predatory publisher cannot be successful, irrespective of any evidence of improvement. This is quite a frustrating scenario. If a bad boy does some good work, he should be encouraged; if that bad boy becomes good, he can be an example to the other bad boys. In fact, the teacher (here, Beall) can also be proud of transforming a bad boy into a good one.
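For what it’s worth, the backlog arithmetic in point 3 checks out. A minimal sketch (the function name is mine; 860 and 741 are the counts quoted above, and a 365-day year reproduces the figures given):

```python
# Appeals are limited to one every 60 days, so hearing one appeal per
# listed entry, sequentially, takes entries * 60 days in total.
DAYS_PER_YEAR = 365  # the figures above assume a 365-day year

def appeal_backlog_years(entries: int, days_between_appeals: int = 60) -> float:
    """Elapsed years to process one appeal for each listed entry."""
    return entries * days_between_appeals / DAYS_PER_YEAR

publishers = appeal_backlog_years(860)  # ~141.4 years
journals = appeal_backlog_years(741)    # ~121.8 years
print(round(publishers, 1), round(journals, 1), round(publishers + journals, 1))
```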

Akbar Khan (from India)

As far as I’m aware, while DOAJ has indeed taken on the task of purging predatory journals from its list, it has not taken on the task of identifying predatory journals. Theirs is a whitelist, whereas Beall’s List is a blacklist. Both functions are (I believe) important, but they are very different. I think we need both — but I wish Beall’s List were managed in a more transparent way.

Not sure I agree, Rick. (Blacklists by librarians have a repugnant history.) I have studied Beall’s writings and his list, and it is a bit of a mess. Perhaps he should stick to blogging about these issues and publications. Lists have a permanency which blog articles do not.

I’m not aware of the repugnant history of librarian-maintained blacklists. Can you share any details?

Also, I guess I might as well point out that “blogging about these issues and publications” is exactly what Beall currently does. His list is a blog. As for what you suggest is the essential permanency of lists, it’s illusory. There doesn’t need to be–indeed, should not be–anything ineradicably permanent about Beall’s list; one of the things that he ought to be doing with it is changing its content as needed, correcting it where he has made mistakes, removing publishers who mend their ways, etc. (I’ll be writing a posting on this very topic shortly, for what it’s worth.)

I think Beall’s list is already having a permanent effect. I came to know that the Ministry of Higher Education of Nigeria copied Beall’s list and made it their own blacklist. Pakistan’s Higher Education Commission also copied Beall’s list to make their own blacklist, and a reputed Indian university (Pune University) similarly adopted Beall’s list as its official blacklist. So forget OASPA, DOAJ or COPE: Beall’s list, with all its limitations, is now becoming an official government list. It is really a proud moment for Beall that a single person’s work is becoming the government policy of many countries. These ministries, departments and commissions of higher education are very rigid and stagnant organizations; once they adopt something, they are not going to change it. Even if, tomorrow, Beall’s blog removes some names because of good work, these ministries are too lazy to update their official government orders. I am really concerned about the fate of some small publishers when I recollect the comments of Phil Davis: “And while Bohannon reports that Beall was good at spotting publishers with poor quality control (82% of publishers on his list accepted the manuscript), that means that Beall is falsely accusing nearly one in five as being a ‘potential, possible, or probable predatory scholarly open access publisher’ on appearances alone.”

Try a Google search on library censorship. When I was growing up it was all about Arthur Miller and Peyton Place.

Beall’s blog and his list are two different things. Surely you can see this.

Ah, you’re equating blacklisting (of the type we’re discussing here) with censorship. They’re somewhat related, but not the same thing. Calling out a publisher’s business practices as predatory is not the same thing as censorship.

As for Beall’s blog and his list: they are one and the same. Go to and look at it: it’s a blog–published on the same platform as the Scholarly Kitchen–and the content of that blog is his list. (He may have another blog as well, but what we’ve been discussing here is both a list and a blog.)

My point is that (at least in my ancient generation) librarians have a reputation for trying to impose sanctimonious standards on their community. When I studied Beall’s writings, blog and list I concluded that this is what he is doing. For example, he is lumping dishonest publishers, who cheat authors, with vanity publishers, while the latter is a legitimate business. He is also targeting startup publishers and low budget publishers.

The difference between a blog and a blacklist seems obvious to me.

My point is that (at least in my ancient generation) librarians have a reputation for trying to impose sanctimonious standards on their community.

So now we’ve slid from the “repugnant history” of “blacklists by librarians” to librarians having a “reputation for trying to impose sanctimonious standards on their community.” The latter isn’t quite as inflammatory an accusation, but I’d still appreciate a concrete example of what you’re talking about. (And this time, please don’t respond by giving me the assignment to come up with your examples for you. That’s not how intellectually honest discourse works. If you accuse a group of people of doing something bad, coming up with supporting evidence is your job, not theirs.)

I am not accusing anyone of anything, Rick. I am merely reporting an iconic reputation that was common when I was young. Do you dispute this report? Mind you this was a time of social revolution, or at least we thought so at the time.

Nor do I see a slide. The point is that librarians should not be telling us who is good and who is bad, as Beall is.

I am not accusing anyone of anything, Rick. I am merely reporting an iconic reputation that was common when I was young.

Sorry, but that’s not true. Your exact words were “Blacklists by librarians have a repugnant history.” That is a straight-up accusation that my profession has a history of doing something repugnant. Feel free to retract that statement if you’d like, or to substantiate it, but denying that it’s what you said won’t wash.

Good point Rick. If I change it from repugnant history to repugnant reputation, will that do? I do not personally know that the reputation was earned and suspect it was not. Social movements are like that, demonization-wise.

I want to ask Lars Bjørnshauge some questions about DOAJ’s new approach to the evaluation of journals. The second version of the “best practice” was introduced in July 2015; “along with the need for more detail was a call for more transparency,” added Bjørnshauge. Yes, transparency is the main demand during evaluation. Beall completely failed in this area, and the way DOAJ is now operating, I fear that it is also going to fail. Many people complained that Beall was completely subjective in his evaluations, and I fear that DOAJ is going in the same direction. I have received many forwarded messages from OA publishers in which DOAJ rejected an application like this: “Your journal has been suggested for inclusion in DOAJ (Directory of Open Access Journals) and I have recently evaluated it. I’m writing to notice you that I have had to reject your journal, this since it does not adhere to DOAJs best practice.”

Where is the transparency? Nothing. The mail does not hint at the journal’s weaknesses, and it does not give any suggestions; that contradicts DOAJ’s own stated policy. I suspect DOAJ is simply matching the publisher’s name against Beall’s list: if the name is present there, the application is rejected with just such a subjective evaluation. More is expected from DOAJ. Please don’t repeat the mistakes of Beall.

I’ve been offering detailed criticism and statistical analysis of Beall’s list of “publishers,” but my stuff doesn’t appear in LJ. (For the record, I regard SciELO and Redalyc as first-rate platforms.)

Last year, the then-head of CAPES, Jorge Almeida Guimarães, published an article in a predatory journal published by the so-called Canadian Center of Science and Education, a publisher known for easy acceptance and included on my list. The article is here:

If SciELO is so wonderful, why is a president of Brazil’s largest higher education funding agency publishing his work in a questionable, non-SciELO journal? Why do thousands of other Latin American researchers also abandon SciELO and publish in commercial publishers’ journals?

Why did the Brazilian government invite six major commercial scholarly publishers to visit last year and make proposals on hosting Brazilian journals (a plan later thwarted by Brazil’s protectionist law of tenders and contracts)?

Note that my blog post merely asked a question, and the evidence shared so far has been helpful. One commenter remarked that the largest library organization devoted to acquiring Latin American library materials, SALALM, is largely uninterested in Latin American science. I think North American academic libraries prefer to collect materials such as those that support the “perfection” of the Bolivarian Revolution, ignoring hard science from Latin America.

I am not deterred by the thought police such as Jones or Virkar-Yates. They both work for for-profit companies that give strong lip service to OA, as they reap profits selling proprietary content, a clever business strategy.

You misrepresent the business I am in and avoid the nature of my concerns.

For the record Semantico does not sell “proprietary content”, we sell content delivery and access management solutions. As vendors to the scholarly community we take a neutral stance towards OA and work to support all the business models that our publisher clients require. In point of fact, the vast majority of our clients operate a subscription model.

My issue with you has always been the inflammatory and derogatory language you deploy in seeking to further your agenda. You have sidestepped this criticism in your comment above. The fact that you consider it an act of the ‘thought police’ to call you out for using the language you do simply proves my point.

Jeffrey, one of my problems with your list is that I do not understand how easy acceptance is predatory. Who is the prey, certainly not the authors? In my book your concept of predatory is hopelessly confused. It is just whatever you do not like.

Hello Jeffrey,

Thanks for commenting on the blog. Having said that, I don’t feel that ad hominem attacks are very helpful. In my mind, somebody acting like the thought police would say that you don’t have a right to write the things that you wrote in your blog. I would never say that. Instead, I was explaining why I think you’re wrong about SciELO.

I don’t know the ins and outs of why the Brazilian government invited commercial publishers to bid on running some journals, but based on the original claims on your blog, the reason the plan was halted is that excluding Brazilian companies was deemed unlawful. On the face of it, that doesn’t sound like protectionism. Either way, it has nothing to do with whether SciELO is good for Latin American science.

You asked why some academics choose to publish in places other than SciELO. That seems like an odd question. Academics publish in a range of venues. There are many options and so choosing not to publish with a particular platform every time is hardly a condemnation.

Similarly, I really don’t know what the fact that the head of CAPES published in a predatory journal unconnected to SciELO has to do with SciELO. I’m not sure what argument you’re trying to make there.

I have to say that your thoughts here seem quite confused. In my post, I set out my reasons for thinking that you’re wrong about SciELO. You argued that SciELO is a ‘Favela’ because it’s not in Scopus or Web of Science. Factually, that’s incorrect on both counts. I go on to say that local publishing is important to maintain a locally relevant research base.

I think that maintaining local excellence is an important feature of scholarly communication and should be supported, which is one reason that I am in favor of projects like SciELO. In my mind, it has parallels with library publishing and university presses as a way to maintain niche areas of scholarship or research of local interest.

Mr. Beall’s answer is really interesting because, once more, it reveals his biases, and it demonstrates his superficial understanding of the Brazilian situation.

1. The first thing to know is that Jorge Almeida Guimarães is not exactly friendly with SciELO. This situation derives both from personal feelings and from the politics between São Paulo State and the Federal Government. If Guimarães publishes in a suspicious, possibly predatory journal, it reflects mainly on him, not SciELO.

2. The assertion that “thousands of Latin American researchers abandon SciELO” needs to be proved. Did they ever publish in SciELO journals? And, even if they publish outside SciELO journals, does it mean that they do not want to publish in SciELO journals?

In some countries, moreover, researchers are given incentives (e.g., money bonuses) to publish in “international” journals, and the systematic use of the impact factor in evaluations may also go a long way toward explaining why some researchers prefer to publish elsewhere.

Finally, Mr. Beall regularly conflates commercial journals with good journals. Quite a few non-commercial journals are high-quality: think of the Public Library of Science, for example, or of the many journals published by scientific associations.

3. The Brazilian government did not directly invite six publishing multinationals; it was CAPES, a branch of the Ministry of Education, that did that, and its President then was … Jorge Almeida Guimarães.

4. Brazil’s law of tenders and contracts is Brazil’s law. Characterizing it as “protectionist” may be correct, but so what? Is protectionism inherently bad?

Jean-Claude Guédon

5. The comments in the last two paragraphs are essentially out of bounds.

I am based in South America and am fully acquainted with the SciELO project. As in many spheres of life, it has light and darkness.

SciELO was in fact founded in 1997 as a publicly funded meta-publisher for the region. It prospered largely because it was then mission-driven: the purpose was to create a common internet platform on which learned society journals could be published, as they did not have the know-how or the resources to do this on a stand-alone basis. In this sense, SciELO has completely fulfilled its purpose.

However, the downside is equally important.

SciELO overrides journal editorial policy: when journals apply, they must comply with specific article-type proportions (research articles are preferred over review articles, to mention one imposition).

SciELO has a very primitive user interface and no comments are allowed, among other shortcomings.

SciELO does not add publisher value, even though it is a meta-publisher.

Because it is free to journals, SciELO has had a negative externality in that it has made journal owners reluctant to invest in modern online publishing systems, even when open source (such as OJS).

In my view, the main problem with SciELO is that academic institutions think it is a database the likes of MEDLINE, which it is not. It is more like PubMed Central. And yet, academics are required to publish their research in journals with Web of Science Impact Factor or in SciELO. But when systematic reviews are undertaken, SciELO is not listed as a database, while the more comprehensive reviews will look at LILACS, the true regional database.

I believe that SciELO will continue to exist and thrive as long as public research funders and academic institutions keep considering it a metric of academic performance. It is also a reflection of the low investment in science and technology in the region (generally less than half a percentage point of GDP), as it contains a huge proportion of methodologically flawed articles with poor reporting standards. But no one is yet willing to bell the cat and call SciELO what it really is, rather than what authorities imagine it might be.

The downside you mention — upholding minimum editorial standards — is an upside in my opinion. I appreciate that SciELO will bar entry or de-list journals that, e.g., don’t follow their retractions guidelines. Check “Criteria for admission and permanence of scientific journals on SciELO collection.”

I agree their web interface is clumsy at times, and a mobile version is much needed. But I found their support staff responsive in fixing any typos or inconsistencies when I’d take the time to notify them via email.

I’m not sure what you mean by added publisher value. They do offer things like ahead-of-print (online-first) versions, all the XML interoperability wizardry, etc.

In my opinion the main valid criticism is directed towards their editors and authors, because of the so-called endogenous publishing practices, in which local journals will publish mainly local authors. Perhaps more South-to-South interaction (e.g., Chinese authors in Brazilian journals and vice versa) would help.

Finally, the comparison with PMC is not entirely apt, as SciELO and Redalyc are not just journal aggregators; they are more like co-operative or meta-publishers. Using CrossRef lingo, DOI resolution is one way for publishing organizations to give their imprimatur to the Definitive Work. So if you resolve a DOI for an article mirrored in PMC (or JSTOR), it will actually point to the original publisher’s website — which will be SciELO. SciELO’s role would be a little easier to explain if the sponsoring university departments or learned societies were to stop running their own OJS sites in parallel, although I suspect that is an uphill political battle over perceived journal ownership.

Thank you Felipe for your reply.

I do not believe that SciELO upholds editorial standards; I said that it overrides journal editorial policy, which is quite different. Say I wished to start a journal on clinical practice publishing narrative reviews: I would not be accepted into SciELO because my proportion of original research would be nonexistent. That is not a minimum editorial standard. In my view it is downright foolishness.

What I mean by added publisher value is, for example, copyediting, the ability to comment on articles, proper article metadata, article-level metrics and, of course, nicer layouts, to name a few. (Other posts on this site have described very nicely what publishers offer as added value over original off-the-keyboard manuscripts.)

I actually think it would be best for learned societies to publish their journals on platforms such as OJS and not incur duplicate publication.

And last but not least, once a journal has been accepted for SciELO listing, it can publish pretty much whatever it pleases and the original requirements vanish into the blue! I have never heard of a journal being de-listed from SciELO unless it stops coming out altogether, but then again, I may be wrong on that count.

If each little learned society or university department ran its own OJS as you propose, there wouldn’t be the scale necessary to develop these much-appreciated value-added services: crucial preservation, indexing, DOIs, XML and other convenient features. That’s true for commercial publishers (for-profit and “for-surplus” non-profits), and it should be true for non-commercial co-operative or library publishing initiatives. Even PKP is leaning more and more towards online journal hosting services, which brings it one step closer to becoming a kind of meta-publisher, if you will. (PS: I’ll let you know if I come across a journal that went through SciELO’s exclusion-appeals-readmission process.)

I also feel Beall’s list is completely unfair. There is no rigorous assessment nor a transparent appeal procedure; it relies solely on Mr J. Beall’s decision. I cannot believe how such a primitive “list” has gained so much weight in the scholarly publishing community.
In this particular case, SciELO does not seek profit, and therefore it should not be labeled “predatory” in any case.

One very important reason that Beall’s List has gained so much weight and influence in the scholarly community is that no one else has taken the initiative to do what he is doing. I agree with those who criticize the lack of transparency and rigor in Beall’s approach, and I wish (for example) that he would indicate both the specific reasons for each inclusion and indicate what a publisher would need to do in order to be taken off the list. But until the scholarly publishing industry does a better job of monitoring and calling out the predators and scam operators working within its borders, we’re going to have to be satisfied with the efforts of those outside the industry. Pointing out the shortcomings of Beall’s List is all well and good, but it’s easy. What’s hard is actually doing it better.

I simply want to second Rick’s comment. Beall provides a useful, if imperfect service. If the imperfections are so glaring, let’s see to it that the List gets superseded soon.

I continue to wonder about this: since when have blacklists become so desirable? (I’m sure the Catholic Church of ages past and a certain now-dead Senator would love to know…) Also, will there be a blacklist for subscription publishers as well? If criteria for that include republishing articles without credit and creating phony journals, well, I can think of the first two publishers on the list… but I’m guessing SSP and AAP wouldn’t sanction such a list.

I totally agree. Anyone can create his or her own blacklist on whatever topic and make it public, but giving credit to that list is a question for the community. I vote for giving zero credit to Beall’s list. I would rather have nothing than such a unipersonal and totally biased list.

In the scholarly publishing world, blacklists are desirable to the degree that they accurately identify genuine predators or scam operations, and to the degree that they are maintained fairly and transparently. As I’ve said before, I would welcome a blacklist that meets the above criteria and includes subscription journals. Republishing articles without credit and creating phony journals would certainly, to my mind, be among the legitimate grounds for inclusion on such a list. (Though the rules of transparency would require a clear and public definition of “phony journal.”)

I have no idea whether SSP or AAP would sanction such a list, but since neither of those organizations sanctions Beall’s List, that seems like a moot point to me.

We may be crossing that threshold of minimal fairness and transparency. I used to support Beall’s effort, but not at any cost. How much collateral damage is tolerable? That’s an ethical issue when aiming at a legitimate target (predatory journals), even if you’re nowhere near the line of fire.

I’d like to flip the guilt-by-association risk alluded to in the post and recall that Beall only gained wider notability after he was legitimized through news pieces published in Science and Nature; by the time these stories got picked up by the NYT, he had become a verifiably reliable source. Previously, Beall had authored a couple of more balanced articles in library-science journals and popular-science magazines. Then, with his list/blog sanctioned, it seemed restraint flew out the window, editorial oversight and peer review became things of the past, and accusations without evidence became acceptable.

To recall some of the early context, it must have been shocking back in 2009 to find journals accepting SCIgen papers, and it was criminal for new publishers to steal intellectual property by republishing papers without permission. I can also see how Beall could have been genuinely worried about the science/pseudoscience demarcation getting blurred, and about seemingly imposed OA mandates infringing on authors’ freedom of choice.

But now we’re experiencing the unintended consequences of unleashing a self-appointed watchdog. In no uncertain terms, Beall is seeking to damage the reputation of non-commercial indigenous publishing efforts in favor of international commercial publishers, without any cost/benefit analysis and in a very ethnocentric and marginalizing manner. Would Science and Nature have quoted Beall if they had known of his ensuing lack of restraint and the biased views that have since surfaced and piled up over the years? For the predatory scare not to be misinterpreted as media sensationalism, these publishers ought to clarify and delineate more carefully what exactly they endorsed, perhaps repudiating Beall’s latest stance or issuing an expression of concern over the sources they had originally interviewed. We can only hope that those who let the genie out of the bottle and chose to elevate Beall to an authority will own their responsibility.

As for Beall, his credibility is being undermined, especially among those who perhaps needed to pay the most attention. The Brazilian journalist @Tuffani recently remarked that this incident has only served as ammunition for the true predatory journals.

At COPE, where we have more than 10,000 members from across the world, we also take the position that blacklists are unhelpful. Hence, we have collaborated with DOAJ, OASPA, and WAME to produce a list of principles of transparency (recently revised) by which journals and publishers should be judged and which we use when journals and publishers apply for membership. Anyone is welcome to use the list and we welcome feedback on it.

Sorry about being a bit late to the party on this one. I’m in the wilds of the Scottish Highlands, writing this on my phone having found the only wifi hotspot in a hundred miles. Excuse the typos.

There’s certainly a lot of healthy debate here. What concerns me a bit is that there is an important conversation to be had around predation, emerging markets, local excellence, and information inequality. We need, as a community, to also be talking about other predatory activities, like predatory conferences and author services. Instead, the conversation always seems to get sidetracked and we end up debating Jeffrey Beall and his list.

This worries me because it’s preventing progress. I’ve said before that Jeffrey Beall is divisive and that the tone of the debate quickly gets unhelpful. This comments section is a perfect example. In my post, I didn’t talk about the list. SciELO aren’t even on the list, but a casual reader might be forgiven for thinking that they are. There’s a risk of everyone from an emerging market being tarred with the same brush.

Historically, the major commercial publishers had little interest in publishing Latin American journals. This was one of the main reasons for the creation of SciELO: to create a platform to give visibility to the “hidden” science from this region. More recently, when these publishers began to see the economic opportunities in Latin America and became interested in acquiring or publishing established journals, SciELO became a major obstacle. Most of the major journals are generally happy with the services provided by SciELO and see little advantage in moving to another platform. The blog post by Beall should be seen in this context. In fact, the post contains so many inaccuracies and omissions that one can legitimately ask in whose interest it was published: certainly not scholarly publishing’s.

I also believe that the creation of blacklists without due process is morally perilous and academic research would be much better served by an impersonal list of recommended journals backed by an academic organization with transparent inclusion and exclusion criteria.

Thanks Hooman,

I’ve been reading your comments on Jeffrey Beall’s blog, and one idea that I find interesting is that SciELO isn’t just a website but provides a range of services to publishers, like proper metadata, DOIs, etc.

While 20 years ago, commercial publishers didn’t see Latin American science as a profitable area, they’re taking more of an interest these days.

To what extent do you think SciELO’s work in helping publishers adopt good workflows has led to an increase in the perceived value of Latin American journals and the current increase in commercial interest?

In my limited experience when the commercial publishers make an offer to established journals, they usually leave the editorial functions with the society or institution and take over the production workflow and provide a manuscript tracking system. I therefore do not think they attach great value to these latter functions that SciELO provides.

It seems to me that the commercial interest in Latin American journals is in the reputation and impact of the established journals, especially the multidisciplinary and broad-subject-matter journals. It is easier for commercial publishers to launch niche journals, but launching a new regional or national multidisciplinary journal (and obviously I am not talking about predatory journals here) which will obtain widespread recognition and reputation is much more difficult, hence the value of the established journals. In this respect, I think it would be difficult to quantify what is the contribution of the SciELO platform and what is the result of the work of the editorial team and the support of the hosting institution or society.

INASP supports local journals platforms in Sri Lanka, Nepal, Bangladesh, Mongolia, and Latin America, and this week we are bringing editors and administrators together to discuss some of the key aspects of local journals publishing in the global south. We also work closely with African Journals Online. Between them, these platforms host more than 820 journals, which receive downloads from all over the world.
Why do they matter? Because they publish a great deal of research which is relevant to their countries of origin which would struggle to find a place in northern journals. Think fisheries management in Bangladesh or farmers’ responses to climate change in Malawi. And also because journals are an essential part of a national research “ecosystem”. Journals play a key role in setting standards for research quality. Acting as a referee or an editor is a longstanding way for a researcher in Europe and the US to get broader exposure to what is happening in her/his field and to build the skills to critically evaluate research. These things are no less important in the south.
Some of the journals are well run and compare perfectly well with equivalents in the north. Others, much less so. All are operating in as yet immature research ecosystems where higher education is finally making heroic strides forward after decades of underinvestment. It is in the best interests of local researchers to have a choice about where they publish. But in a world where the rules of the global scientific community are set in the north, it can often be really hard to tell which journals are good quality and which are not. Beall’s list is not perfect. But it has played a crucial role in identifying some unscrupulous players who are seeking to make money out of those who understandably find it hardest to tell the difference.

Quick fact-check:

Phil – excellent post continuing your discussion of international publishing and information inequality – but I need to offer a couple of factual corrections:

You say “According to a post on the SciELO blog by the Brazilian Forum of Public Health Journal Editors and the Associação Brasileira de Saúde Coletiva, the entire SciELO catalogue is currently indexed in SCOPUS.”
In fact, about 80% of the SciELO journals have been selected for and covered in Scopus. The selection process for coverage begins with internal review of the basic, objective publishing and production standards of the journals, which are then submitted for review to an external board.

Neither is it the case that all journals published by SciELO are included in the Web of Science version of SciELO (which is not “core”). SciELO claims 1245 current titles; their materials say, variously, 650 or 700 or “over 700,” and I could not find a list. TR’s Master Journal List does not show whether or not a journal is covered in SciELO, only whether it is covered in one of the other, historical TR products. Finally, SciELO is not part of the Web of Science Core Collection.

Marie is correct. SciELO is not part of the Web of Science Core Collection; it is available for free to all subscribers to the Web of Science platform. The Web of Science version of SciELO currently covers 980 of the 1245 titles listed on the SciELO web site. Any Web of Science user can export a list of all the SciELO titles covered by searching all years of SciELO data and “Analyzing” by Source Title.

Thanks both for the fact check.

I see where the confusion comes in with Web of Science. I had seen a number of library web pages that listed the SciELO index as part of the core, and that’s why I thought it was. It seems that while it technically isn’t, WoS subscribers get access to it for free, so it’s probably just easier to list it as core on your LibGuide.

My assumption about Scopus coverage came from a guest post on the SciELO blog that was possibly translated from Portuguese.

If Don’s numbers are right for WoS, SciELO coverage is about 80%. If Scopus also covers about 80%, that’s actually pretty good and compares very well with many of the larger commercial publishers.
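For what it’s worth, the figures quoted above do bear out the rough estimate; a quick back-of-the-envelope check using the numbers from Don’s comment (980 of 1245 titles):

```python
# Rough coverage calculation using the figures quoted in the thread.
wos_covered = 980    # SciELO titles in the Web of Science SciELO index (per Don)
scielo_total = 1245  # current titles listed on the SciELO web site

coverage_pct = wos_covered / scielo_total * 100
print(f"WoS SciELO coverage: {coverage_pct:.1f}%")  # about 78.7%, i.e. roughly 80%
```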

I think the point still stands that SciELO content is very discoverable.

Mr. Beall’s post is by far the hottest SciELO-related post ever! Not just in terms of the number of replies but, mainly, in its geographical reach.
My congratulations to Phil for very concisely pointing out the wrong and the correct arguments (yes! Mr. Beall is correct to state that SciELO has very poor brand visibility among the North American and European scientific community) used in the original post, and to everyone who commented, replied, tweeted, repudiated, or somehow helped the subject go viral.
Isn’t visibility marketing one of Latin American science’s main problems? Mr. Beall did a great job of putting his reputation to work in favor of SciELO and RedAlyc. Thanks, and let’s keep it rolling!

In this hurly-burly we have lost sight of how it started. Basically, Jeffrey Beall offered two simple theses: (1) SciELO and similar meta-publishers are hidden although OA, and (2) this is due to their inability to ensure visibility at the level of their commercial counterparts.
Although I protested the second claim by pointing to an exception, J.B. is generally right. This is what really concerns me. And not because of SciELO, but because of a citation database I am editing (SCIndeks) and another one I am developing (SEESAmE) together with some OA enthusiasts. Both share with SciELO the same mission and business model. We don’t compete with SciELO; we paddle in the same shaky boat in the troubled waters of the developing world.
The key point here is, as J.B. claimed and as Vivian admitted in her nicely balanced comment, that SciELO “does not add publisher value, even though it is a meta‑publisher”. What kinds of added value? Certainly not the stuff like online‑first or XML interoperability wizardry, which Felipe mentioned in defense of SciELO, but only those functions that boost articles’ visibility, such as related records, cited-by, and the like. Only such publisher value-adds influence visibility. Not visibility as discoverability, or searchability, or downloadability (these are trivial), but visibility as easy and fast availability to the most likely potential users and citers. Those happen to be authors who search for papers of their interest mainly by looking at references, not keywords. Remember, both SciELO and SCIndeks are full-text citation databases, not just repositories, and they are such for a good reason.
Functionalities boosting visibility via high information integration and connectivity are at the same time the most expensive items in citation database development and maintenance. Such add-ons are ever-growing nowadays. Commercial (meta)publishers are capable of producing them almost on a daily basis. Can non-commercial meta-publishers sustain this arms race in the long run?
It looks to me that SciELO gave up the race by joining WoS. The Chinese Citation Index did the same before it; the Korea Citation Index also followed, while the Russians are on the way. This may be a reasonable solution as compensation for a lack of integration, but at what price? Can small meta-publishers afford contracts with Thomson Reuters or Elsevier? Would they be accepted? Is, strategically speaking, partnership with global commercial meta-publishers the only way to survive, or should we aim for a horizontal, South-South integration? Is there a third option? Can anybody over there show the way?
What seems to be overlooked by the OA community is that the possible disappearance of small OA meta-publishers as local journal indexers and evaluators puts the whole OA movement at risk. Who, if not they, can take care of journals under the radar of global but exclusive evaluation agents such as Thomson Reuters? Who else can monitor the endless space of world OA publishing to decontaminate it from an ever-growing number of phony journals and increasingly brazen criminal practices? Whatever his motives are, J.B. is doing a wonderful job for OA’s real good, but as is evident from here, he is just a lonely rider on a slow horse, unwelcome in the town.
