Editor’s Note: Today’s post is by Byron Russell, John Sack, Alison McGonagle-O’Connell, and Tony Alves. Byron is a publishing consultant with Woodstock Publishing Services; he brings over 30 years’ expertise in the industry, garnered at Ingenta, Pearson, Berlitz, and Macmillan. John is the Founding Director of HighWire, which facilitates the digital dissemination of more than 3,000 journals, books, reference works, and proceedings. Alison is Senior Director of Marketing at HighWire; she has previously worked with industry leaders including Coko, Aries Systems, EBSCO Information Services, John Wiley & Sons, and Houghton Mifflin Company. Tony has worked in scholarly communications since 1990 and leads a team of product managers at HighWire Press whose portfolio ranges from online platforms and content management systems to analytics tools, access and entitlement software, and e-commerce and order fulfillment systems.

The number of preprint servers has grown substantially over the past five years, with more than thirty new servers appearing in that period; the total now stands at no fewer than sixty. These servers are diverse, focusing on subdisciplines, specific geographies, or specific languages, and have varying degrees of penetration and technical sophistication. Existing publishing services and workflows are now being reimagined to accommodate preprints. This essay examines how a publisher-centric approach simplifies workflows and speeds up peer review through preprint pre-assessment, and how the checks and balances being implemented by publishers and third parties build trust and confidence in preprints.

The depositing of preprints accelerated even further during the COVID-19 pandemic. In medRxiv’s first six months (June–December 2019), 900 preprints were posted; in its second (January–June 2020), the total was 6,700, more than a sevenfold increase.

The scientific community published over 16,000 articles relating to COVID-19 within four months of the first reported case of the disease, and around 38% of these articles (more than 6,000) were hosted on preprint servers. The papers and research findings were rapidly assimilated by the scientific community because access is free, access is easy, and use and citation guided and fostered further study. For example, a research team from Tübingen University Hospital in Germany was involved in the focused effort to create an effective vaccine. One of their papers was first deposited in medRxiv. It logged 20 citations, 5,500 PDF downloads, 168,083 HTML page views, and an Altmetric score of over 3,700, all before being published in Nature Immunology. The preprint was even featured in 23 newspapers worldwide, including El Pais, The Telegraph, De Volkskrant, and Scientific American, demonstrating a preprint’s potential for impact beyond scholarly circles and into the public media.

As Cold Spring Harbor Laboratory’s John Inglis commented at the 2021 SSP conference session ‘Preprints and the Editorial Work of Journals’, organized by HighWire, the goal of the bioRxiv preprint server is “the acceleration of research and providing unrestricted access to manuscripts before they enter an often-lengthy process of peer review.” The case of COVID-19 has demonstrated that this potential can be realized.

Still, some publishers, editors, and authors feel there are open questions surrounding preprints. One is whether speed implies a lack of quality. Another is how publishers will integrate preprints into traditional publishing processes. After all, posting to a preprint server is often an independent endeavor undertaken by the authors themselves, who select an appropriate server ahead of the processes of submission, peer review, and final publication.

For a growing number of publishers, however, preprints are no longer viewed as peripheral or competitive to the traditional publishing process. Rather, the deposit and evaluation of relevant preprints have become an integrated part of the publishing workflow. The growing importance of speed of dissemination, together with publishers’ desire to encourage greater depth and breadth in their article submissions, means many publishers are exploring parallel rather than sequential workflows. As discussion in the SSP session suggested, the depositing of preprints is being transformed from an individual initiative into a fully integrated part of the whole publishing process, and existing structures are being adapted to support this integration. New career paths and job roles are emerging within the scholarly publishing ecosystem, and new services are being developed to handle preprints and content selection creatively. Preprint manuscript pre-assessment, among other strategies, is being deployed to foster trust both in preprint servers and in preprints themselves.

Pre-assessing the Preprint

In order to build trust in preprint submissions and nurture authors in the publishing process, publishers and third-party service providers are using pre-assessments of preprints. Though not replacing formal peer review, such independent, informal commentary and evaluation supports both authors and editors in evaluating submissions.

As an example, the Company of Biologists (CoB) offers a service called preLights. Early-career researchers (‘preLighters’) participating in the program personally select and spotlight what they consider to be interesting biology preprints on bioRxiv, the leading preprint server for biology (bioRxiv accepted over 38,000 manuscripts in 2020 alone). They then annotate these with blog posts, questions, and comments. These ‘preLight posts’ draw attention to new papers perceived as being of special interest or importance. PreLighters can interact directly with the authors of a paper, who are invited to respond to the comments and questions raised. Though not a formal peer review, this process lends transparency and further credence to the authors’ work. The program gives preLighters a first step onto the publishing ladder as reviewers, and draws further attention to bioRxiv itself. Over the past three years, there have been more than 1,000 preLight posts.

For authors requiring a more formal early review of their research paper, services such as EMBO’s Review Commons provide a Refereed Preprint. Launched in 2019, the concept is a high-quality, journal-independent peer review of a paper before it is submitted to either a preprint server (bioRxiv, currently) or a journal. Refereed Preprints are not intended to replace formal publisher-led peer review; rather, they act as a guide for the author(s) prior to eventual submission. The service package covers a review of the authors’ manuscript, feedback from a single round of independent peer review, and the authors’ responses. This model enables authors to make informed decisions about the most appropriate journal for subsequent submission, and can streamline the submission process itself.

Post-review, Review Commons takes on the administrative tasks of depositing the Refereed Preprint to bioRxiv and enabling submission to seventeen affiliate journals published by EMBO Press, eLife, ASCB, The Company of Biologists, Rockefeller University Press and PLOS. The editors of these journals have agreed to consider and assess the transferred reviews. This may mean that a full review process does not necessarily have to be started over, and potentially saves time for editors, thereby accelerating the time to publication and broader dissemination. Even if the editors decide to reject the paper, the authors can re-use the initial peer review evaluation for submission to another journal. In addition, both the independent reviews and the authors’ responses can be posted to bioRxiv through the server’s new Transparent Review in Preprints (TRiP) project, putting the paper itself in context and allowing for greater transparency in peer commentary prior to final publication.

Submission Reimagined

In order to fully integrate preprints into the submission process, publishers must review and overhaul the process itself. This involves a re-assessment of the importance of the preprint as part of the publishing process, and trust – as highlighted in the section above – is an important component. It also entails a re-alignment of internal systems, including staffing and administrative roles.

As an example of how roles may be re-aligned, in 2017 the Royal Society appointed its first Preprint Editor, Dr. Maurine Neiman, to its flagship journal Proceedings B. Dr. Neiman’s role centers on soliciting submissions to the journal from papers already deposited in preprint servers – pre-selecting potential contributions from existing papers available through external sources, rather than from direct submissions. She employs a team of twenty graduate and post-doctoral researchers in the selection and author-communication processes, integrating the bioRxiv preprint server into the manuscript submission workflow so that it runs parallel to the traditional submission-to-peer-review-to-acceptance process. The approach may be considered a formalized version of CoB’s preLights service, with early-career researchers making an important contribution to the publishing process.

The goal of the project was not simply to increase the number of submissions by cherry-picking existing papers from preprint servers, but also to encourage researchers to consider Proceedings B at the point of submission, a journal they might not have contemplated previously. From this point of view, the preprint editorial project has been remarkably successful and is being extended; in early 2020 the Royal Society journal Open Biology appointed the University of Huddersfield’s Professor Michael Ginger as its first Preprint Editor.

Preprint Servers: No Longer A Missing Link

Two further case studies help demonstrate that preprints are now integrated across the whole publishing process. The first illustrates how PLOS is using parallel submissions and pre-assessment to speed up publication and increase trust in preprints themselves. The second describes how the CoB introduced a two-way integration of their manuscript submission systems, seamlessly linking submissions and preprints.

When Nonia Pariente was appointed Editor-in-Chief of PLOS Biology, PLOS’s flagship Open Access journal in the life sciences, she commented in an interview that thinking outside the box is “almost an obligation” for the journal. The most recent innovation has been the introduction of direct linking between preprint-server deposition and journal submission. Since May 2018, authors submitting a manuscript to PLOS Biology and almost all other PLOS journals have been able to opt to post their article simultaneously on bioRxiv, PLOS having a working partnership with bioRxiv’s owner, Cold Spring Harbor Laboratory.

For PLOS’s authors, this provides a valuable opportunity to make their latest findings available on a trusted platform before peer review, which can be a frustratingly long process. For bioRxiv itself, the link with PLOS increases manuscript deposits.

An important point is that depositing in bioRxiv with simultaneous submission to PLOS is not an automatic process. A key aspect is the pre-validation of each paper submitted. To ensure standards are maintained, bioRxiv, in agreement with PLOS, performs an initial screening covering a plagiarism check (using iThenticate, via Crossref Similarity Check), article scope, and ethical and technical considerations. Approximately 70% of all submissions pass and make it onto the bioRxiv server, though passing does not automatically guarantee acceptance in a PLOS publication.

An additional benefit is that this early process of checks and balances helps to promote overall trust in bioRxiv-hosted preprints themselves. Evidence of such trust is the increasing number of preprints that go on to be published.

[Figure: a stacked bar graph showing the 30 journals that have published the most bioRxiv preprints. The bars indicate the number of preprints published by each journal, broken down by the bioRxiv categories to which the preprints were originally posted.]

CoB has also built a direct-deposit workflow connection with bioRxiv. Back in 2016 it introduced a two-way integration of its manuscript submission systems, which run on BenchPress, a manuscript management tool developed by HighWire. Authors depositing a manuscript in bioRxiv can submit their paper directly to any of CoB’s journals (Development, Journal of Cell Science, Journal of Experimental Biology, Disease Models & Mechanisms, and Biology Open) via a simple transfer portal, avoiding the need to re-enter information manually or upload files to multiple submission systems.

Authors submitting a manuscript to any CoB journal can also deposit their article in bioRxiv immediately. The BenchPress submission system employs a simple, integrated one-click link which enables the paper to be deposited directly into bioRxiv. The paper is directly citable with a bioRxiv DOI, and its usage and download metrics can be tracked, even while it is undergoing review for the journal.

Author feedback has been positive since the launch of the ‘direct-deposit’ initiative. The system encourages authors to submit to CoB journals in the knowledge that their work will become immediately available, and citable, on bioRxiv before peer review and decision, and regardless of formal publication, with minimal administration required on their part. This two-way link is mutually beneficial and helps to smooth the path from preprint to eventual publication.

Conclusion

It is clear that major publishers, particularly in the biosciences, are working hard to broaden their editorial and submission workflows to incorporate preprints in parallel processing operations. Changes in peer review, governed by publishers and third-party service providers, appear to speed up the entire publication process and build additional trust in preprints themselves. The case studies related above make it apparent that changing established, “traditional” workflow structures to integrate preprints needs careful planning and creative, innovative thinking, including the re-evaluation and re-allocation of staff roles.

Submission systems must be adapted to include direct linking to appropriate preprint servers, enabling the depositing of both manuscript and related metadata. Even after the paper has been deposited on the preprint server, there needs to be regular dialogue regarding the preprint and the version that may eventually find its way into the journal, with links created between the version for journal submission and the preprint for any subsequent revisions of both metadata and content.

There is an apparent tension between the “need for speed” – never more evident than in the rush to find the most effective treatments and preventative measures against COVID-19 – and the time-consuming thoroughness required by extensive peer review. Nonetheless, these case studies show that the two requirements are not irreconcilable. Publishers and independent peer-review platforms have demonstrated that they can modify the traditional submission workflow, accelerating the tempo of scholarly dissemination while satisfying the essential requirement of comprehensive peer review in learned publishing.

Innovations such as those described put publishers into the driver’s seat, influencing and enabling the flow of preprints to preprint servers. This should lead to enhanced quality control and confidence in preprints. In the future, publishers themselves will enjoy greater involvement with the scholarly communication chain as they adapt internal processes to envelop the whole publishing arc, from preprint submission to eventual journal publication and beyond.

Byron Russell

Byron Russell is a consultant with Woodstock Publishing Services. He brings over 30 years’ expertise in the industry, garnered at Ingenta, Pearson, Berlitz, and Macmillan.

John Sack

John Sack is the Founding Director of HighWire, which facilitates the digital dissemination of more than 3,000 journals, books, reference works, and proceedings. He is also Co-Director of the International Congress on Peer Review and Scientific Publication.

Alison McGonagle-O’Connell

Alison McGonagle-O’Connell is Senior Director of Marketing at HighWire. She has worked with industry leaders including Coko, Aries Systems, EBSCO Information Services, John Wiley & Sons, and Houghton Mifflin Company.

Tony Alves

Tony Alves has worked in scholarly communications since 1990 and leads a team of product managers at HighWire Press whose portfolio ranges from online platforms and content management systems to analytics tools, access and entitlement software, and e-commerce and order fulfillment systems.

Discussion

6 Thoughts on "Guest Post — Publishers Integrate Preprints Into Their Workflows"

It is good to see that members of the publishing industry, while carefully employing the term “preprint” rather than “preprint publication” (i.e. a “print” made public), are now feeling less threatened. However, services such as “preLights” seem to be reinventing a post-publication-review wheel that the NCBI sadly abandoned in 2018 as a “failed experiment” – PubMed Commons (PMC). Here, not only “early-career” researchers, but folk with long experience and high expertise could “personally select and spotlight” what they consider to be interesting biology preprint publications (as well as papers already formally peer-reviewed and published). It seems possible that PMC was too revolutionary, not only for the publishing industry but also for some researchers who, relaxing after their work had been formally reviewed and published in a high profile journal, suddenly encountered an extra layer of post-publication review by experts who had been overlooked in the earlier rounds of reviewing.

Given that the NLM’s rationale for shutting down PubMed Commons was a lack of interest and participation from the community (rather than researchers being upset with further attention being paid to their work), one might question the scalability of such schemes. If you figure 2.6M science and engineering papers annually (https://ncses.nsf.gov/pubs/nsb20206/), add in all the work that gets written up but rejected and not published, then add in all the non-science and non-engineering research being published, that’s an awful lot of material to cover. Or will future career advancement end up being about scoring a “glamor” preprint deemed significant enough to draw attention?

Yes, there is “an awful lot of material,” but much of it is good enough to provoke commentaries from specialists monitoring their particular fields. And the blooming of the preprint publication revolution that has led Byron Russell et al. to invoke services such as preLights is very much a post-2018 activity. Because it permits rapid responses, misinformation (or even disinformation) can be nipped in the bud, and false memes that so easily gain traction are less likely to endure. A recent example, which began in a journal concerned with molecular evolution in 1995, has only recently been scotched by an SSRN preprint publication that ended up in Genetica 149, 81-88.

A point of clarification about Review Commons: affiliate journals agree to make their assessment primarily on the existing reviews. This is reflected by our finding that no new reviewers were consulted for 98% of accepted papers and 86% of rejected papers (https://asapbio.org/review-commons-9-months). In that sense, Review Commons substitutes for the first round of review at a journal and integrates with the journal publishing process, rather than simply serving as a guide for authors.

When you speak to scientists and researchers, they don’t see publishers as the place to publish preprints, or as actually having a role in preprints. They see publishers as the place to publish their finished work. They are quite happy to post preprints on arXiv or any other preprint server, and while a publisher or journal signposting to, publishing, or integrating a preprint might be “nice” or “cool”, they don’t really care; it doesn’t change their behaviour or user need!
