Editor’s Note: Today’s post is by Danny Kingsley. Danny is a thought leader in the international scholarly communication space. She works as a consultant and is a Visiting Fellow at the Australian National Centre for the Public Awareness of Science.
There’s been a great deal of movement around Open Research recently. UNESCO adopted its Recommendation on Open Science in 2021. In April 2022, Finland published its Policy for Open Scholarship. In November, Ireland launched its National Action Plan for Open Research, and Romania released its White Paper on the Transition to Open Science 2023-2030. In December, Ukraine announced its National Open Science Action Plan – even in the middle of a war. And in the US, the Office of Science and Technology Policy has just announced 2023 to be the ‘Year of Open Science’.
Much of the reasoning behind this focus on Open Research is a need to address the reproducibility crisis – Retraction Watch reported nearly 5,000 retractions in 2022. Managing the integrity of the scientific record represents a significant (and growing) cost to many players in the scholarly communication process. Recent detective work to identify a ‘Multimillion-dollar trade in paper authorship’ has highlighted some of the work that publishers are having to undertake to retract the publications linked to this practice.
Something is rotten in the state of research. And while retracting work that is fraudulent or incorrect is very important, it is expensive and too late. We should be nipping the problem in the bud much earlier – back at the training researchers receive at university. Achieving Open Research (and therefore increasing reproducibility) requires the provision of systematic research training that focuses specifically on research practice.
A recent preprint, “The Australian academic STEMM workplace post-COVID: a picture of disarray” paints a disturbing picture of what is happening down under. The authors surveyed over 500 early career researchers in 2022, following a similar 2019 survey. The issues the survey identified are wide-ranging, but what stands out from a reproducibility perspective is the link between poor supervision and poor research practice.
Academia works on a master/apprentice model for training. This is outdated, inefficient, and problematic. In the survey, fewer than a quarter of the respondents had been part of a discussion about skills development in the past 12 months. Only 13.1% had been provided an example of appropriate research standards, and 9.1% had had no discussions at all about their career in that time.
Nearly half of the respondents said they had been impacted by poor research practices – with the ordering or inclusion/exclusion of authors the most common issues. Other issues included fabrication, falsification, and the selective use or plagiarism of data. Nearly 30% of respondents thought these practices were potentially severe enough to warrant paper retraction, staff dismissal, or a grant being repaid. Concerningly, the most common source of pressure to engage in poor research practice was the respondents’ supervisors.
Questionable research practice is not restricted to Australia by any means. A recently published study of 1,336 PhD students from five European countries uncovered that granting guest authorship to a ‘person in power’ was not uncommon, with approximately three in ten respondents indicating they had done so at least once. The study identified that questionable authorship practices are prevalent among early career researchers, apparently reinforced “through a combination of coercive power relations and dominant norms in some research cultures, particularly in the natural, technical, and medical sciences”. And a recently published study of 285 graduate students in Jordan found that only 22.5% of the students were aware of authorship criteria and that knowledge of the different forms of scientific misconduct was lacking. Constructive mentorship support was also absent.
Those of us working in scholarly communication are greatly impacted by these poor practices – from the need to retract papers, to the constant updating of the bibliometric record, provision of guidelines, and the need to develop policies and processes within institutions to manage poor practice.
Australia does not have a centralized academic integrity office, despite calls for one. The Australian Academy of Science has recently established the David Vaux Research Integrity Fellowship Award to recognize individuals who have led efforts in their own or other institutions to foster and promote integrity in science. Rather than focusing only on addressing research misconduct (which of course is necessary), we need to be concentrating on the source – the training of researchers. The sector needs to identify and articulate a clear framework or curriculum for training and professional development in areas of ‘research practice’ beyond disciplinary-specific skills and knowledge.
My 2008 PhD looked at the reasons why there was a low uptake of open access in Australia. During the in-depth interviews I conducted, I asked about participants’ experience in learning how to write academic papers, and what they were now doing to train up their own students. The findings were revealing. The majority of interviewees had experienced very little training in relation to the writing of papers and the choice of journals in which to publish. In turn, there was a wide range of positions on training their students, with many saying it was ‘easier’ if they wrote the papers themselves, and one saying, “I’m not sure if it’s my responsibility to train them to write”.
Our current system is not serving the needs of research students in relation to learning research practice. This is clearly demonstrated by the range of offerings from multiple providers available to bridge this skills gap. For example, writing papers is a ‘career-critical skill’. For over a decade, the Write a Scientific Paper (WASP) course has been offered to early-career medical professionals in Malta, London, Bahrain, and now online. WASP’s ultimate objective is to “impart the faculty’s collective experience to the delegates in this crucial, early aspect of career progress.”
Beyond writing itself, a central skill in the scholarly communication process is peer review. Publishers and library staff alike will be aware of the phenomenon of peer review training sessions being over-subscribed. There have been claims that the “distinct lack of training in peer review has worsened the crisis in academic publishing”. And a recent review article discusses the need for training in peer review, suggesting a peer reviewer residency as a solution.
I subscribe to a ListServ called Biomed News on Scholarly Communication which scans biomedical papers weekly for scholarly communication issues. A recent missive included the following: Becoming an Effective Manuscript Reviewer, How to Be a Great Peer Reviewer, Peer review: improving feedback and Serving as a reviewer. It is not unusual for the list to include this volume of papers published in the academic literature on how to *do* peer review.
As the parties directly affected by reviewers’ ability to complete peer review (and by its quality), it is not surprising that publishers are also stepping into the peer review training space. As a small example, Wiley Author Services offers Peer Review Training, PLOS has a Peer Reviewer Center, and Elsevier’s Researcher Academy covers peer review amongst other areas. Publishers training researchers in peer review makes sense given the current situation. But a PhD is supposed to train candidates to become researchers. Knowledge about peer review, copyright, research data management, open research practices, academic integrity, and the like should not be left to publishers to pick up.
It is not just researchers who need this training. Those supporting researchers and working alongside them in the Third Space also need research practice skills. One offering to this cohort comes from the Southern African Research and Innovation Management Association, which runs a Research Ethics and Integrity Online Short-Course aimed at research managers.
There are some glimmers of hope. The UK Reproducibility Network is working with Research England on a five-year Open Research Programme which includes a Training Workstream that is building “a UKRN training syllabus and resources catalogue”. It’s a good start but this is a sector-wide and international issue. One approach could be to invest in updating existing frameworks such as the 2011 Vitae Researcher Development Framework. Another could be to engage with the National Information Standards Organization.
The Australian preprint makes some recommendations, including extending the length of PhD training to allow time for more skill development, specifically project management and leadership skills, to better prepare candidates for academia or the wider scientific workforce. There is a program already in place in Australia called the Transferable Skills Toolkit, which focuses on prioritizing tasks, managing expectations, demonstrating achievement, and inspiring and managing others. These are all good skills to have, but they do not address issues of academic integrity.
Universities often refer to graduate attributes or graduate qualities. While these are admirable and important, we also need to consider ‘researcher practice attributes’ and collectively agree on the generic research practice skills needed to ensure open and reproducible research. Once a framework is broadly accepted, there is an opportunity to standardize training across multiple providers, allowing qualifications to be built through a global accreditation system.
Let’s face it – none of the players in the scholarly communication space – not institutions, publishers, libraries, research offices, or academies, and not the academic community itself – can solve this alone. We need to work together.