Peer Review Week 2024 celebrates a theme that is rapidly reshaping the academic publishing landscape: Innovation and Technology in Peer Review. As we look ahead, the tools and systems driving efficiency and quality in peer review are becoming increasingly sophisticated, from AI-powered manuscript checks to automated reviewer matching systems. But, as exciting as these innovations are, the real challenge lies not just in implementing the technology, but in managing the human and organizational changes required to make these innovations stick.

In this post, we go beyond the surface of technological advancements to explore what it truly takes to integrate these tools into the fabric of peer review processes. We’re talking about change management — the strategic, and sometimes complex, process of preparing teams, adjusting workflows, and aligning organizational culture to adapt to these innovations. Without careful management, even the most promising technologies can falter if the people behind them aren’t properly equipped to embrace the changes they bring.

We’ve invited three peer review managers from leading publishing groups to share their insights into how they are leading their teams through these transformative processes. Today we hear from Laurie Webby, Peer Review Manager at the American Society for Microbiology (ASM), Alison Denby, Vice President, Journals, at Oxford University Press (OUP), and my co-worker Erika Mann, Peer Review Operations Manager at the Public Library of Science (PLOS). From practical tools like checklists and plagiarism detection software, to fostering cross-departmental collaboration, their stories offer a roadmap for others navigating similar journeys. Through their perspectives, we’ll learn not only about the innovations themselves but also the critical strategies for ensuring these technologies are successfully adopted and embedded into everyday practice.


How do you drive innovation in your everyday peer review practices?

Innovation takes many forms, from incremental improvements to sweeping transformations in peer review. For these peer review leaders, innovation is not just about adopting the latest tools but also about fostering a culture of adaptability and continuous improvement.

Erika: Incremental progress towards improving processes! At PLOS, there’s been a recent shift towards agility in improvement and problem-solving, which is a relatively new way of working for us. In the past, we worked under a more ‘traditional’, linear model: identify a problem and a solution, then track whether the solution works once it’s already been implemented. It can be hard to adapt to new ways of working and to convince people that they might lead to better outcomes, but it’s important to keep an open mind and be willing to take some risks.

Laurie: I believe innovation can stem from many different sources — whether it’s from colleagues, editors, authors, reviewers, or even readers. I regularly audit our peer review workflow and processes to assess what is working well for our authors and what could be improved. My primary goal is to streamline the initial submission process at ASM, making it as quick and straightforward as possible. By focusing only on the essential requirements for review, we aim to ensure that authors receive timely decisions on their submissions.

What have you learned from introducing new peer review technologies?

Implementing new technologies often reveals both the strengths and weaknesses of current workflows. Our experts reflect on the unexpected challenges they faced when bringing innovations to their teams and the lessons learned through trial, error, and eventual success.

Alison: Experiment, experiment, experiment. There’s substantial scope to improve the act of peer review and, in doing so, enhance the value we add to research for authors and reduce some of the existing biases. You should also carefully consider the type of peer review work you’re experimenting with. Generative AI based on Large Language Models (LLMs) is only one tool, and it often isn’t the most effective solution for specific needs, particularly within our highly technical and specialist area. Take, for example, tools designed to assist editorial staff with securing reviewers. Often the biggest challenge isn’t finding the right reviewer – it’s how you approach a suitable person at the right time, in the right way, to persuade them to act as a reviewer. While technology is rapidly evolving and we’re adapting in response, the fundamental principles remain unchanged: high-quality work, carefully reviewed and enhanced, to maximize impact in its broadest sense.

An even more specific example is the use of language editing tools before submission, to help level the playing field for authors for whom English is not a first language. We have seen a reduction in the rejection rate for content where a language editing tool has been used.

Erika: I’ve been doing some research on what other publishers are doing, specifically what tools they use to manage peer review and ensure that papers meet journal policies and expectations. From this research, I’ve learned that a lot of organizations are creating tools that leverage AI and emphasize real-time collaboration between contributors and the journal office. I find a lot of value in a more collaborative peer review process, both on the management side and for improving research, so I’m excited to see how those tools shape up and how we can use those learnings to improve processes at PLOS.

What tools, processes, or checklists have been most critical in managing these innovations?

From AI-based plagiarism detection to automated reviewer matching, peer review teams are increasingly leveraging new tools to streamline their workflows. Our experts share which technologies have been the most impactful, why they chose them, and how these tools have reshaped their processes.

Alison: At OUP, we have introduced a wide range of tools and processes. Our primary focus has been on a plug-and-play approach to pull together a combination of best-in-class capabilities that will continue to evolve as the technologies change.

Erika: Over the last three years, PLOS has launched a new suite of journals, and each launch brought new ways the Peer Review Operations Team had to learn to work. Specifically, we introduced a new mechanism in our peer review management tool that lets our vendor escalate questions to PLOS staff more effectively. Rather than add staff to discussions, we use ‘Editorial Tasks’ in Editorial Manager, a feature that allows us to assign work directly to staff. This seemingly minor innovation has brought much-needed accountability to our workflows: content doesn’t get lost, and we can develop better reporting. We can now effectively track submission volumes and specific types of escalations, and adjust any of our workflows as needed. I’d argue this change has been essential for keeping peer review timelines intact and contributing to a better author experience than our previous method. It’s also allowed staff to prioritize their work efficiently.

Can you provide an example of an innovation or technology you’ve introduced and the hurdles you faced during its implementation?

Transformational change doesn’t come without its challenges. Each expert shares a real-world example of a tool or process they’ve implemented, highlighting the key hurdles they encountered during the change management process and how they overcame them.

Alison: At OUP, we’ve tested and are currently using many different tools to support multiple stages of the peer review process:

  • Integrity detection tools: AI applied across peer review, from language editing to experiments with “fact-checking” – e.g., does a lab have the equipment to undertake the research presented?
  • Journal audits and checklists: understanding the nuance between disciplines – what’s an acceptable editorial practice in one field might be flagged as a concern in another, so tools need to be appropriately tailored
  • Editor education and training
  • Persistent Identifiers (PIDs) and frameworks like CRediT: for author accountability and understanding the funder/institution, with improved metadata around all content. These identifiers help to flag potentially problematic content but also help to make connections – e.g., understanding if an author has an APC discount as a society member or as part of a Read and Publish (R&P) deal
  • Collecting information on data availability

The biggest challenge we have encountered is bringing the whole community around a journal (authors, reviewers, editors, and society partners) along with us in the face of a lot of change. Many changes add work for an editorial team. Where possible, we try to shift administrative work away from our editors (particularly in areas where subject matter expertise is not critical) so they can focus on content.

Laurie: One improvement we implemented across all our journals was the automatic review assignment system. I made this decision because previously only half of our journals were using it. By standardizing this feature, we aimed to relieve our editors of the daily task of manually checking reviewer availability, as the system automatically assigns a new reviewer if one declines. This also helps prevent overburdening our reviewer pool with unnecessary emails, particularly in situations where reviewers are invited but later informed that their participation is no longer needed.

The most challenging aspect has been gaining full support from the editors. Some feel that this process takes longer, although data shows that journals using the system correctly have comparable turnaround times to those that do not. However, overcoming this perception has been a persistent challenge.

Have you experienced cross-collaboration with teams or departments that typically don’t intersect with peer review?

The intersection of peer review with other departments and external collaborators is increasingly necessary for successful innovation. Whether working with IT, data science teams, or even external vendors, collaboration has become critical to scaling innovation across organizations. Our contributors reflect on their experiences forming new partnerships and how their teams adapted to these evolving changes.

Laurie: Since the launch of journal portfolios – itself an innovative new business model ASM introduced to better position our journals for a more sustainable future – the peer review team has been collaborating more closely with the editorial teams to implement innovations suggested by the editors. This marks a shift for us, as the editorial teams have only been part of the strategic plan for about four years. For over half of my peer review team this is a new experience: before this change, they worked directly with the editors. As a result, it has been an adjustment in terms of how they receive information, who they receive it from, and how they process and act on it.

Alison: There’s an increasingly necessary interconnectivity across everything we do to attract a submission, manage it, publish, promote, and sell. Each element relies on another – e.g., we need a Ringgold ID in the production process to know whether an author can benefit from a Read and Publish agreement. We see journal publishing as a holistic process and work across functions to support each stakeholder, collecting performance data and metadata throughout the workflow.

We are very fortunate at OUP to have exceptionally good cross-functional collaboration. Critical to success has been exceptional communication, but also a willingness for departments to understand the potential consequences of their actions elsewhere and sometimes to compromise, keeping program objectives and success in mind over individual areas. There are painful moments and difficult decisions, but we’re fortunate to have a team prepared to tackle conflict, listen, and compromise. We are also fortunate to be a part of the University of Oxford and able to take a mid-to-long-term view in terms of strategic positioning, which has allowed us to be consistent in approach.

Erika: Absolutely! At PLOS, project teams have become more cross-functional, involving folks from entirely different departments. When you’re in meetings with collaborators whose tasks differ from your own, you learn right away that you must adjust your communication – not only in style but also in content – to make sure everyone is on the same page. What may be understandable to you may be completely incomprehensible to someone else. In a way, it’s been a great form of self-reflection.

Another essential change we’ve had to make is not being afraid of asynchronous work. We have folks in different time zones, so it’s not always feasible to meet in real time.

What else can peer review teams learn from your experience?

Our contributors wrap up their insights by reflecting on the broader implications of Peer Review Week’s 2024 theme — how innovation in technology can shape the future of peer review. Their final thoughts offer actionable advice for teams looking to navigate the rapidly changing landscape of peer review.

Alison: I’d offer three major points of advice:

  1. Experiment and learn
  2. Stay true to your mission and strategic priorities
  3. Don’t underestimate the human aspect of effective change management

Laurie: Don’t hesitate to explore new ideas! Take the time to thoroughly research and pilot any initiative that you believe could enhance the peer review process for authors, reviewers, and editors.

Erika: Be willing to take calculated, data-driven risks, and remember that progress, no matter how ‘small’, is progress.

Join the Conversation

As we conclude, we invite the broader peer review community to share their own experiences. What have you learned from implementing new technologies in peer review? How have you managed change, overcome hurdles, and supported your teams through these transitions? Your insights could provide valuable guidance to others on the same path, contributing to the collective knowledge around innovation and change management in peer review.

Jasmine Wallace

Jasmine Wallace is the Senior Production Manager at the Public Library of Science (PLOS). She is responsible for the production processes and day-to-day production and publication operations for the PLOS journal portfolio. Previously, she was the Peer Review Manager at the American Society for Microbiology (ASM), where she was responsible for ensuring peer review practices, workflows, processes, and policies were up-to-date and applied consistently across the entire portfolio of journals. She served as Treasurer for the Council of Science Editors and was the creator and host of their podcast series S.P.E.A.K. In the past, she was a Teaching Assistant at George Washington University for a course on Editing for Books, Journals, and E-Products.

Erika Mann

Erika Mann is a Peer Review Operations Manager at PLOS where she manages vendors, workflow optimization efforts, and a team of direct reports, alongside her fellow Peer Review Operations Managers. Prior to PLOS, Erika spent several years in the tech industry as a marketing and PR specialist. Her favorite pastime outside of work is cozying up with a good book.
