Let’s say you’re a small but progressive non-profit research society with a journal. You believe strongly in the principles of open access (OA), and largely agree with the goals of cOAlition S. You are working diligently toward “flipping” your journal to OA from its current subscription business model, but realize that doing this in a sustainable manner may take some time (particularly as new business models beyond the APC are just emerging). You have a significant number of authors with Plan S funding, so the Transformative Journal route to compliance seems like a good path to keep those authors contributing to the journal while you make your OA transition.
But once you read the Transformative Journal reporting requirements, you will realize that this route is likely out of reach for all but the larger and wealthier publishers. Once again, a well-intentioned policy has created further inequities in scholarly communication.
Transformative Journals (TJs) are one route offered by cOAlition S “to encourage publishers to transition to immediate Open Access.” Through this route, a subscription/hybrid journal can remain compliant and eligible for Plan S authors by committing to becoming fully-OA and by meeting a set of OA growth requirements each year until 2024, when support for TJs ends and they are expected to convert fully to OA. Let’s ignore for now the OA growth requirements for TJs – Delta Think’s recent analysis covers this well and shows how unrealistic the numbers are and how few journals are likely to progress adequately given the timelines involved.
Instead, I want to focus on the reporting requirements for TJs. Tallying up the number of OA articles published each year is easy to accomplish. The transparent pricing reporting requirements remain vague and meaningless enough that they shouldn’t prove too onerous for even smaller publishers to put together. Where things get difficult, if not impossible, is in the requirement for an annual public report to cOAlition S, a report that must include data on downloads, citations, and Altmetric scores for all papers published, and that must be sub-divided into OA papers versus non-OA papers.
For those working at larger publishing houses, this likely sounds trivial. You’d just assign your team of in-house bibliometric analysts to pull citation data from your expensive Web of Science, Scopus, or Dimensions subscription. Download information can be obtained from the usage-tracking service you pay for, or perhaps it comes with the full-service publishing platform that your organization owns or contracts at significant annual cost. Altmetric numbers can come from your access to the paid service of the same name. Your staff bibliometricians will, of course, spend the necessary time parsing out the OA articles from everything else.
Hopefully the theme running through that last paragraph was fairly obvious – none of this is free, much of it is very expensive, and in-house bibliometric expertise is rare among smaller publishers. If you’re running a journal on a shoestring budget, then subscribing to any of the services above is probably beyond your means. Parsing out OA article data from non-OA articles is neither simple nor easy. Standards have been created for tagging articles to show OA status, but these are not widely in use and where they are, it’s generally the biggest publishers who have implemented them.
Having spent the last decade transitioning independent publishers into partnerships with a larger publisher, I can vouch that very few journals beyond the elite programs have this sort of infrastructure in place. Gaining access to bibliometric data and analysis is often a key motivator in signing a Publishing Services Agreement with a larger publisher. Left on their own, smaller journals and programs will struggle to provide the required level of reporting.
And so we are left with a policy where reporting compliance is largely out of reach for anyone other than the larger and wealthier portfolios. This seems to be a running theme with Plan S, as the Transformative Agreement route has been similarly plagued with the end result of driving consolidation and reinforcing the dominance of the biggest publishers. The non-Plan S Projekt DEAL seems to have suffered a similar fate. For those who were hoping Plan S would overthrow the status quo and usher in a new era of scholarly communication free from commercial incentives and constraints, these real world outcomes are probably disappointing.
Why require this reporting roadblock at all? It’s not clear what cOAlition S hopes to gain from these statistics, other than perhaps using them in promotional efforts to tout their own success – and even for that purpose, it’s unclear what value they would offer. Non-OA content is often made freely available, so controlling for the effects of free access is going to be impossible. We know that Altmetric scores are enormously influenced by the marketing efforts that publishers and authors put behind individual articles, and are likely to be more affected by the size of an author’s network than by the article’s access status (remember, altmetrics don’t tell you whether someone has actually read the article they’re mentioning). Citation is a slow process – there’s a reason that metrics like the Impact Factor and CiteScore look at multi-year windows – so reporting on the small numbers of citations to an article in the same year that it is published will likely provide little useful information. Simply parsing out OA from non-OA content will also be skewed and uninformative, as article types with little citation potential (editorials, book reviews, news stories, etc.) will likely all fall into the non-OA pool.
Going forward, it will be interesting to see how vigorously cOAlition S monitors and polices TJ compliance. It’s not going to look great if the majority of participating journals fail to meet requirements and get kicked out of the program right away. Hopefully, as has been the case so far, this aspect of Plan S will see further pragmatic updates as the unintended consequences it has spawned become evident.
In the meantime, for many hybrid journals, particularly those with significant numbers of Plan S-funded authors, the TJ route may still be an attractive option despite an inability to meet reporting requirements. Described elsewhere as a “yeah, why not” strategy, there seems to be no downside to signing up for TJ status. Even if the journal fails to meet growth targets or the publisher completely ignores reporting altogether, a year of additional compliance (and possibly more, depending on enforcement) is offered with seemingly no penalties for failure.
With nothing to lose, I expect to see more journals applying for Transformative status. Unfortunately, the program’s success is going to be greatly reduced by its overly complex and costly reporting requirements, and many journals that are very much in line with cOAlition S’s goals will be shut out.
6 Thoughts on "More Unintended Consequences: How the Plan S Transformative Journal Route Favors Larger Incumbent Publishers"
It’s certainly not ideal, but I’d just like to point out that your hypothetical small publisher probably has editors who also work at, or live within a short drive of, a university that has subscription access to one or more of those expensive metrics services and allows walk-in patrons to use them. Covid makes “walk in” more problematic in some places, but Plan S was written before Covid, and the coalition could perhaps be pressured to extend the compliance deadline because of it.
Do most (working academic) editors have the expertise to go into Web of Science and extract the full data set required, as well as process it in the required manner? Do all universities uniformly subscribe to Altmetric? And the university isn’t going to help with usage data. Time is money, and this is asking a lot on top of what is probably already a volunteer position.
And still I ask, what is the point? This data is largely going to be meaningless as it does not control for confounding factors, so it will not lead to meaningful conclusions. This is an absolutely unnecessary hoop that journals must jump through that adds no value and damages what could be a useful initiative.
I agree. Just have to point out that most librarians are happy to help patrons use their licensed products, regardless of the use case, and absolutely if the person asking is faculty at their university. Don’t be shy about asking for help on that aspect.
I think this would make a great SK article if someone took a sample of editors and tested this.
FWIW, I do think the notion that the financial reporting shouldn’t be too hard is optimistic. I couldn’t have easily done it for the journal I edited recently. I’m not sure I could have done it at all, honestly. There are a lot of assumptions built into the reporting requirements.
Thanks for this article, David, and for the reference to Delta Think’s growth requirement analysis. I’m going to recommend it to the Cdn. Ass’n of Learned Journals listserv because the issues you raise so closely parallel the efforts of Canada’s Social Sciences and Humanities Research Council (SSHRC) to push journals towards OA. The issues are only slightly different. SSHRC provides subsidies for OA journals — it used to be for all non-profit Canadian journals — at Can$1,050 per article, or Can$850 if the journal uses APCs. Currently, it allows a maximum 12-month delay in open access. Obviously, both funding levels are insufficient for any journal except one running on volunteer time and underqualified student labour.

By requiring OA after 12 months at most, SSHRC has transformed subscriptions from access to current issues and archives into 12 months of early access to content. Such a transformation denigrates the service journals provide in contributing to the record of knowledge and building research communities, and it encourages subscription cancellations, if only because such access is unjustifiable to a budget-cutting SSH librarian (at least outside Canada) who recognizes the “luxury” of immediate access to SSH content. (We have data showing a sustained maximum level of usage for the average article of about 12 years.) The result has been that many of Canada’s most reputable journals have walked away from SSHRC’s subsidy in order to preserve their market earnings, which, from non-Canadian sources, are greater than the maximum Can$30,000 grant SSHRC offers. Slightly smaller independent journals are being tempted by concentrated ownership in the form of partnering with a Sage or T&F. This would inevitably increase costs to libraries, but would probably gain a journal global readership and better protection against subscription cancellations.
Interesting post, David. I agree with you that this type of data collection has little value in itself. According to the linked Plan S Excel sheet, it is not even defined what kind of citation data (Web of Science, Scopus, Dimensions, Crossref, etc.) is required in the reporting.
It is, though, relatively easy to obtain some of the data Plan S requires, even without expensive subscriptions, with the help of a DOI:
“Altmetric Attention Score” + “Dimensions citations” via the free Dimensions search page:
“Scopus citations” + “Crossref citations” via PlumX Analytics:
Obviously, it is pretty tedious to obtain data for many articles this way, but for a smaller Transformative Journal publishing a few dozen articles a year it would be doable.
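To add to this: for citation counts specifically, the lookup can even be scripted for free. The Crossref REST API returns an “is-referenced-by-count” field for any registered DOI, so a small publisher could automate at least that column of the report. A minimal sketch in Python (standard library only; the function and variable names are my own, and note these are Crossref’s open citation counts, which will not match Web of Science or Scopus figures):

```python
import json
import urllib.request

CROSSREF_WORKS = "https://api.crossref.org/works/{doi}"

def parse_citation_count(record: dict) -> int:
    """Extract the citation count from a Crossref works record.

    Crossref reports citations in the "is-referenced-by-count" field
    of the "message" object; treat a missing field as zero.
    """
    return int(record["message"].get("is-referenced-by-count", 0))

def crossref_citations(doi: str, timeout: float = 10.0) -> int:
    """Fetch the Crossref citation count for a single DOI (live request)."""
    with urllib.request.urlopen(CROSSREF_WORKS.format(doi=doi),
                                timeout=timeout) as resp:
        return parse_citation_count(json.load(resp))
```

Looping `crossref_citations` over a year’s worth of DOIs (politely, with a pause between requests) would cover the citation portion of the TJ report; downloads and Altmetric scores would still need the manual lookups described above.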