Yesterday we published several Chef reactions to the OSTP Policy Memo (a.k.a. the Nelson Memo). Today we present the rest!

What are your initial thoughts about the OSTP policy announcement?

View of the White House with the Washington Monument in the background

Todd Carpenter: A lot of attention will rightly be paid to the requirements in the Memo for releasing content immediately upon publication and to the focus on the release of supporting data. This guidance will certainly have an impact on the dissemination of scholarly outputs. Other Chefs will focus their attention on those issues.

I’d like to take a more practical, in-the-weeds approach to some of the details for a moment. Just as posting a document to a website isn’t ‘publishing’, requiring open content to be posted in a repository somewhere is not necessarily the same as integrating it into the ecosystem of scientific literature. Since the original 2013 guidance from the OSTP, we’ve seen mixed results when it comes to how content is shared in that ecosystem. It was therefore refreshing to see the memo outline the high-level details of distributing research outputs as vital.

Therefore, I’d like to draw attention and appreciation to the sections focused on the infrastructure of scientific communications. The mundane world of persistent identifiers, metadata, and file structures plays an important role in the Memo, predominantly in the footnotes and the links to additional documentation. Embedded in this policy are pointers to the variety of technical details that will make a tremendous difference to the usability, traceability, and integrity of the open science ecosystem.


For example, the OSTP Memo outlines that publications not only be made available, but be made available “in formats that allow for machine-readability” that also allow for accessibility for the print-disabled. NISO’s JATS standard is called out in a footnote as an appropriate format in wide use for this purpose (see page 4, section 3.a.iii, and footnote 5). It is not simply that a file with the scholarly output be distributed, but that its “semantic meaning” not be lost. Later in the memo, in the section on research integrity, the importance of transparency in the identification of “authorship, funding, affiliations”, and other aspects of federal research is stated. This openness is best facilitated by a network of identifiers and metadata. Section 4 (page 6 and the associated footnotes 16, 17, and 18) highlights these important components of the guidance, specifically outlining the expectations for metadata and persistent identifiers (PIDs). These elements are also critical in gathering the data necessary to assess the impact of this policy, and the impact of scientific investments more generally.

These elements of the policy will make the ecosystem of mandated publicly accessible publications and open data easier to navigate, to search, and to use. Much of what constitutes the FAIR principles is centered on metadata and identification for discovery, use, and reuse. What is troubling from a systems perspective, however, is that these elements are not equally supported by a directive for funding, in the way that the public access provisions for content may be. Presuming that some level of support goes to editing and distribution, there is a risk that these systems will be “left on the cutting room floor” in the race to cut fat out of the publication process as costs are driven lower. These infrastructure systems are not inexpensive to build, to populate, or to maintain once launched. The research ecosystem has a long and successful history of supporting newly created outputs, but a less stellar record when it comes to supporting those systems after the initial funding has been exhausted. Hopefully, resourcing the infrastructure that supports open content will be supported in the same way that initial publication fees are expected to be. Perhaps the attention on these issues at the OSTP level will push national funding toward these systems in a more systematic way, rather than relying on the publishing ecosystem as their primary funding source.

Angela Cochran: A few things came to mind with the announcement of the OSTP memo, and more so with the economic impact report that came with it. First, there is recognition that peer review is highly valued. If it weren’t, this memo would just require preprinting. The problem is that there was no accompanying recognition that facilitating peer review is expensive. No matter what anyone says, most researchers will never just randomly spend their spare time looking for papers on preprint servers to review. And even if they did, this would be fraught with problems. Journals use expensive systems and staff resources to try to ensure that conflicts are mitigated, and, increasingly, to provide bias training, reviewer mentoring, and other training.


I won’t go into all the benefits of traditional peer review, but it seems OSTP gets it. They want the free public version to be the peer reviewed version — not the grant reports, not just the data, and not the submitted manuscript. So that presents a problem for journals operating under a subscription model, which are most of them. The only answer for preserving peer review as it is known today is to charge for it.

Second, as others have noted, the economic impact report was inadequate. It heavily relies on flawed data provided by lobbyists for open access (OA) policies. Article Processing Charge (APC) averages will not stay at $2,500-3,000. Not because the APCs will be artificially inflated, but because for highly selective journals, the APC model doesn’t work at that price point.

The OSTP report seems to give a nod to the technological development and infrastructure that they claim enables inexpensive online publishing, without noting that those technologies and that infrastructure were paid for, and continue to be financially supported, by publishers. It’s worth thinking about which activities should be funded by which organizations in the future. The report specifically claims that plagiarism detection will be easier with no embargo, which leads me to believe that OSTP doesn’t know that we all already pay for plagiarism scanning within paywalls. And we do it prior to publication.


Lastly, highly selective journals will have a hard time “flipping” to OA, and allowing Green OA with no embargo is a non-starter. Many journals in this category are society journals, and they invite a lot of content. Some of this content comprises the most-read articles in these journals, as it provides context and important commentary on the latest research. This is not content for which an APC can be collected, and a lot of it is currently free to read. I foresee the continuation of a hybrid model whereby original research is OA and the rest goes back behind a paywall accessed via subscription. This is not guaranteed, as we don’t know whether institutions will care enough about this content to pay a subscription for it. OA journals operate on a volume model for obvious reasons. This doesn’t work for selective journals. And while some OA publishers have kept somewhat selective OA journals afloat, they do so by also having OA journals that are not very selective. Many societies differentiate themselves (fairly or not) from commercial journals by promising high-quality, highly vetted, impactful content. Having a less selective OA journal is a philosophical hurdle.

The effects of the pandemic, increasing labor costs, and the continuing consolidation that is driving up publishing costs already have smaller and society publishers scrambling. The timing of this OSTP policy shift is going to be another tough pill to swallow, and I have no doubt that some smaller organizations won’t make it.

Lisa Janicke Hinchliffe: Much will be said about this Memo in the coming months, and I’ve already said quite a bit on Twitter about zero embargo, and so today I’d like to focus on the section on “Ensuring Scientific and Research Integrity.” The recommendations here relate to metadata and persistent identifiers as well as related infrastructure. The minimum requirements for metadata to be collected and made available by the agencies are author and co-author names, affiliations, and sources of funding; date of publication; and a unique digital persistent identifier for the research output. Federally funded researchers are to be instructed to obtain a personal digital persistent identifier and agencies are to “assign unique digital persistent identifiers to all science research and development awards and intramural research protocols” in order to link funding agencies and awardees through the identifiers. Completed plans for implementing these provisions are due December 31, 2026, with effective dates no later than one year after publication of an agency’s plan.
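To make the minimum requirements concrete, the fields Lisa lists could be represented as a simple record like the following sketch. This is purely illustrative: the memo does not mandate any serialization format or field names, and every name, identifier, and value below is hypothetical.

```python
import json

# Hypothetical sketch of the minimum metadata the memo asks agencies to
# collect and make available: author/co-author names, affiliations, and
# funding sources; date of publication; and a persistent identifier for
# the research output. Field names are illustrative, not a mandated schema.
record = {
    "authors": [
        {
            "name": "Jane Researcher",           # author name
            "affiliation": "Example University", # author affiliation
            # Researchers are to obtain a personal digital persistent
            # identifier; an ORCID iD is one widely used option.
            "person_id": "https://orcid.org/0000-0000-0000-0000",
        }
    ],
    "funding": [
        {
            "funder": "Example Agency",
            # Agencies are to assign persistent identifiers to awards,
            # linking funders and awardees; this URL is invented.
            "award_id": "https://example.org/award/ABC-123",
        }
    ],
    "date_of_publication": "2025-01-15",
    # Persistent identifier for the research output itself (e.g., a DOI).
    "output_id": "https://doi.org/10.xxxx/example",
}

print(json.dumps(record, indent=2))
```

Nothing in the memo prescribes JSON; the point is simply that the required elements are a small, well-defined set that existing identifier systems already cover.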


Scholarly Kitchen readers will be familiar with existing identity infrastructures that will likely feature prominently in the agency plans — DOI, ORCID, GRID, etc. — as well as the roles that publishers have played in standing up and maintaining these systems. We can anticipate that the agencies will lean heavily on publishers to provide the required metadata so that it can be collected and made publicly available in public access repositories. Improved metadata and identifiers will be particularly welcomed by those who do bibliometric research studies, as well as by librarians using tools like Unsub to re-value their subscription spend in light of zero-embargo availability of content. This may also renew publisher interest in Distributed Usage Logging as a way to document article use.

What is less clear is how these metadata and identifiers are envisioned to ensure scientific and research integrity. Authorship information is rarely omitted from publications (though pseudonyms are occasionally used). It is not uncommon for a retracted article to have a DOI. This is not to say that such metadata and identifiers could not be useful in developing strategies to address integrity issues. But, per se, metadata and identifiers are descriptive not evaluative and the agencies have not been directed to use the metadata. The requirement is to collect and make it available publicly. Nonetheless, one can imagine that this publicly available data may encourage further development of integrity services and products.

Disclosure: Lisa is a member of the ORCID Board of Directors.

Karin Wulf: My first thought about the new OSTP memo is: will this help?

The public desperately needs more research-based humanities – history, literature, art, and more. Anything that can get more humanities research into the public square, into conversation, and especially into policymaking, is great. We are deluged with evidence-free assertions about history, living through a world in which law and policy can be startlingly unmoored from the best and the most we know about the past — and often with dire consequences.


Will the Nelson Memo help stem (ha, we must pun where we can) the tide of apathy, derision, and opposition to the importance of humanities research? Will it actually push more sorely needed humanities research out to where it’s needed?

So many of the issues and so much of the design around Open Access is about STEM. Think about this obvious point: it’s the White House Office of Science and Technology Policy that issued this memo about federal support for research. And we know that the humanities are a bare whisper of a hint of a suggestion compared to the full-throated holler of government support for STEM. The same is true of scholarly publishing, which is driven by high dollar, high volume, must-keep-up publishing that is in many regards nearly the opposite of humanities publishing, which is funded at low dollar levels, slower to produce and to digest, intensely edited, and mostly non-profit.


Humanity needs the humanities. Not just for aesthetics or fun (those, too, have value), but for democratic governance. If you were, for example, facing a pandemic (or the threat of anti-democratic movements the world over, or the increasing concentration of wealth), it is supremely helpful not only to have vaccine science and public health protocols, but also to understand the cultural and social contexts in which vaccination and protocols will land. So, will this help? Is there even a concern for whether and how this can help? More to come.

Michael Clarke: The new OSTP policy is an ambitious and wide-ranging policy update. The memo expands the scope of the policy to include all agencies that fund research and scholarship, so it will apply not just to the biggest agencies (NIH, DOE, NSF, NASA, USDA, and DOD) but also to humanities work funded by the National Endowment for the Humanities. It also potentially expands the range of content it applies to, encompassing book chapters, editorials, and conference proceedings. Most notably, it includes immediate public access to the data sets reported on in published articles. This is potentially game changing for science, but there is very little detail provided in the memo – and no mention of funding to support what is likely to be a significant initiative.

The centerpiece of the policy, however, is the removal of the 12-month post-publication embargo that has been OSTP policy since the Holdren Memo was issued in 2013. The new OSTP policy appears positioned as a continuation of the “Green OA” policy outlined in the Holdren Memo. The implications of removing the post-publication embargo will likely be a shift not to Green OA, however, but to Gold (a scenario acknowledged in the Impact Statement that accompanies the Nelson Memo).

As my colleagues and I stated in a recent analysis of the OSTP policy and its implications, the scholarly publishing landscape is very different in 2022 than it was in 2013. There is vastly more content published OA due to Plan S, Transformative Agreements, and authors without OA mandates opting for Gold OA venues. “Born OA” publishers are flourishing. “Traditional” publishers have shifted almost the entirety of their portfolios to hybrid and are launching fully Gold OA titles at a steady clip. In addition, OA content (whether the author’s accepted manuscript or the version of record) is more discoverable than previously. Google Scholar and other indexes surface author accepted manuscripts. ResearchGate and Academia contain millions of author-provided article PDFs. Locating a freely accessible copy of a research paper is much easier today than a decade ago. And librarians are using more sophisticated tools, including Unsub and COUNTER 5, to factor out OA content when evaluating journal subscriptions.

The OSTP policy update will accelerate these trends by making hundreds of thousands of additional papers freely accessible and readily discoverable. This will make subscriptions to many journals decreasingly viable. How long a window of subscription viability remains open depends on the details and decisions made at the agency level. If agencies require only the deposit of author accepted manuscripts in agency-designated repositories and do not require a liberal reuse license (e.g., CC BY), the window may remain open a decade or longer. If agencies require deposit of the version of record or liberal reuse licenses, then the era of the subscription may end much sooner for many journals.

Ann Michael: Plan S got our attention several years ago even though for many publishers, especially those that published mostly US-funded research, the impact of Plan S was going to be negligible. It was a directional indicator. It started a conversation that extended far beyond its potential impact on most publishers.

But OA was growing regardless.

The 2022 OSTP memo has now grabbed our attention again. While it doesn’t get prescriptive on license type or rights retention, it does include research funded by any federal agency, it extends that openness to data, and it includes metadata requirements (as several of the Chefs have pointed out) for tracking funding. For many publishers, its reach is far more impactful.

Two things have become clear. OA is the direction of travel and the OSTP just put its foot on the accelerator!

What are YOUR initial thoughts about the OSTP policy announcement? Please share them below in the comments.

Ann Michael


Ann Michael is Chief Transformation Officer at AIP Publishing, leading the Data & Analytics, Product Innovation, Strategic Alignment Office, and Product Development and Operations teams. She also serves as Board Chair of Delta Think, a consultancy focused on strategy and innovation in scholarly communications. Throughout her career she has gained broad exposure to society and commercial scholarly publishers, librarians and library consortia, funders, and researchers. As an ardent believer in data-informed decision-making, Ann was instrumental in the 2017 launch of the Delta Think Open Access Data & Analytics Tool, which tracks and assesses the impact of open access uptake and policies on the scholarly communications ecosystem. Additionally, Ann has served as Chief Digital Officer at PLOS, charged with driving execution and operations as well as their overall digital and supporting data strategy.

Todd A Carpenter


Todd Carpenter is Executive Director of the National Information Standards Organization (NISO). He additionally serves in a variety of leadership roles at organizations including the ISO Technical Subcommittee on Identification & Description (ISO TC46/SC9), the Coalition for Seamless Access, and the Foundation of the Baltimore County Public Library.

Angela Cochran


Angela Cochran is Vice President of Publishing at the American Society of Clinical Oncology. She is past president of the Society for Scholarly Publishing and of the Council of Science Editors. Views on TSK are her own.

Lisa Janicke Hinchliffe


Lisa Janicke Hinchliffe is Professor/Coordinator for Research and Teaching Professional Development in the University Library and affiliate faculty in the School of Information Sciences and Center for Global Studies at the University of Illinois at Urbana-Champaign. lisahinchliffe.com

Karin Wulf


Karin Wulf is the Beatrice and Julio Mario Santo Domingo Director and Librarian at the John Carter Brown Library and Professor of History, Brown University. She is a historian with a research specialty in family, gender and politics in eighteenth-century British America and has experience in non-profit humanities publishing.

Michael Clarke


Michael Clarke is the Managing Partner at Clarke & Esposito, a boutique consulting firm focused on strategic issues related to professional and academic publishing and information services.

Discussion

7 Thoughts on "Ask The Chefs: OSTP Policy Part II"

“What is less clear is how these metadata and identifiers are envisioned to ensure scientific and research integrity” – Yes, metadata are necessary but not sufficient to foster research integrity.

In this context, metadata are essential because they enable verifiable attribution. The problem is that “authorship” is too vague and outdated a concept to enable useful granularity.

This seems like an opportunity rather than a problem for scholarly societies and publishers.

It is for this reason that the CRediT effort, now managed by NISO, has received so much attention and support. Being clearer about the granularity of “authorship” in the context of science is extremely important. There will need to be additional maintenance work to continue expanding the terminology to embrace not only papers, but the other outputs that researchers are developing.

Karin’s comment speaks eloquently–from the humanities and arts perspective–to creating a culture in which federally funded work, both in this arena and in STEM, is available, respected, and of service to its fields and to policymakers and consumers more broadly. Eager to hear more discussion about this, I’m taking her last sentence as a promise.

OA unquestionably helps researchers, especially those not in well-funded institutions. But I think we’re still in the early days of understanding the impact on the public square of easy direct access to research. Quite a bit of confusion was injected into the public conversation about COVID during the pandemic due to journalists reporting on the latest preprints, not to mention their use – and misuse – as “evidence” on social media. I think with both STEM and humanities research, journalism plays the key role in creating impact for the public. Good journalism requires synthesizing and interpreting, and responsibly reflecting the nuance inherent in existing knowledge on a given issue of public interest. It’s not amenable to “of the moment” journalism or bolstering a political case. The role is analogous to that played by publishers and peer review. It’s valuable yet increasingly economically disincentivized.

I’m still wondering if the bigger implications are over the raw data. For example, in the humanities you might want to keep some of your sources anonymous, but if the raw data contains enough identifiable information, what are the implications if it has to be public? It can’t be insurmountable (I’m sure drug manufacturers deal with it all the time), but I’m sure it will worry some researchers until clearer instructions are released.

There are already fairly advanced and robust conversations about the privacy and security implications of sharing research data sets (there are at least three RDA working groups exploring this topic). It is recognized in the memo that not all data can be made publicly available, for a variety of reasons. That said, whether there is robust sharing of experience and best practice across scholarly domains is certainly an open question. One hopes that these issues will be built into robust policies–did I skip over saying standards!–that are consistently advanced by each agency affected by this guidance.

Thanks for this great analysis in both Part 1 and 2. There’s lots to think about. One point of clarification – GRID is referenced but GRID has been retired and ROR has taken over – https://ror.org/blog/2022-03-17-first-independent-release/

I agree that metadata and identifiers have an important role to play (disclosure: I’m Executive Director of Crossref and on the Operations Working Group of ROR) – ROR, DataCite, ORCID and Crossref provide what’s needed to meet the requirements of the OSTP memo on the metadata and identifiers front but implementation will take close collaboration and coordination by all stakeholders.
