When a research funder or government agency sets new policy requirements for the researchers it funds, it is often taking a leadership role. Open access (OA) policies like that of Research Councils UK (RCUK) in 2013 or the current Plan S are put into place to drive solutions to perceived problems. Being a leader, however, comes at a cost (both literally and figuratively), particularly if the rest of the research world doesn’t fall in line.
In the wake of the Finch Report, the RCUK launched their bold OA policy. The policy was undertaken with a sense of the extra weight that the UK would have to carry by taking the lead on OA. For at least a transitional period, costs would climb, as fees had to be paid to make UK researchers’ works free to the rest of the world, while subscriptions were still needed to access research from outside the UK. The costs of monitoring and compliance turned out to be grossly underestimated, but it was hoped that a global shift to OA would soon reduce the extra spending levels initially required.
Things didn’t quite work out that way. While the RCUK policy has been enormously successful (over 52% of the UK’s research output is now available through OA channels), much of the rest of the research world chose instead to move toward Green OA policies, and the UK has remained on the hook for those increased spending levels. Much of the recent effort in the UK has centered on cost reduction and on working with publishers toward creative approaches to offsetting deals.
Plan S faces similar financial issues, largely because so far, its members represent a relatively small slice of the total number of research papers published. US federal funding agencies have stated that they do not plan to sign on to Plan S and despite one public pro-OA statement, there has been no indication that China is going to sign on either. The good news is that offsetting deals (“Read and Publish” or “Publish and Read”) are more common and should help reduce additional spending. However, given the still unclear demands that such deals be “transformative”, these deals may be harder to come by if they require binding promises toward near-future upheavals in business models.
That said, Plan S is a different beast than the RCUK policy because it eliminates hybrid OA journals as a path to compliance, and thus creates a more limited range of options for funded researchers to remain in their funder’s good graces. This has the potential to upset the traditional mentor/mentee relationship in researcher groups, as the needs of the laboratory head may be put in direct conflict with those of the student or postdoc.
To understand the problem, one first needs a clear picture of the financial structure of most science laboratories. The Principal Investigator (PI) is usually the person who applies for and receives the grants that fund the laboratory. They are the ones who will need to prove compliance with the policies of the organizations supplying those funds. The actual experiments are often done by graduate students and postdocs, but in most cases the funding is not in their names — it is in the name of their PI. Any benefits of compliance with funder policies (such as continued or future funding) will go to the PI, rather than to the student, who, if successful, will graduate and leave the lab.
This sets up two separate parties with potentially separate interests: the PI, who wants to keep their funders happy in order to continue receiving money, and the PI’s trainees, whose goal is to maximize their career advancement potential. We know that most PhD candidates are not going to find careers in scientific research, and those who do will have to take scarce jobs wherever they can find them. Because funder policies are not universal, and because the person leaving the lab is likely headed to a different system with different evaluation criteria, this can create a conflict between mentor and mentee.
If you run a lab and have a choice between publishing a paper in a venue that will maintain your own funding, or doing what’s in your student’s best interests (publishing in a venue likely to land them a job outside of Plan S or outside of academia entirely), this puts you in a difficult position. If you’re a student or a postdoc and you’re choosing a laboratory, do you want to go somewhere that may harm your chances against competitors who are not similarly limited in their publication choices? Do you want a mentor who will be required to put your needs second?
To head off any responses that publication record is a poor proxy for judging an applicant, we need to address how things are, rather than how they should be. Plan S requires its members and its funded institutions to sign on to the Declaration on Research Assessment (DORA). A key component of DORA is that a researcher’s individual work should be judged on its own merits, not on the venue in which it was published. While this seems an obvious and sound principle, the problem with DORA is that it does not offer practical solutions to the problem it aims to resolve.
We know that the Impact Factor is deeply flawed as a means of judging the work of an individual researcher or an individual paper. There are reasons, however, why it remains so stubbornly embedded in the evaluation process of researchers. The two key benefits of the Impact Factor are scale (it allows one to quickly evaluate an enormous quantity of applicants without spending hours reading each paper by each applicant) and addressing areas outside of one’s expertise (it allows one to get a sense of work done in a field with which one is not familiar). DORA dates back to 2012, and yet we are still awaiting new methodologies that resolve the problems of the Impact Factor while still offering these benefits. It is unlikely that these entirely novel metrics will suddenly arise and be widely adopted in the ten months before Plan S goes into effect. For now, insistence on DORA means throwing out a flawed methodology and replacing it with, essentially, nothing.
As a result, many of the publisher signatories of DORA continue to promote their Impact Factor rankings, and many of the institutional signatories of DORA continue to make hiring and tenure decisions based on the publication records of applicants. There don’t seem to be any negative consequences for this dissociative behavior, and even the Wellcome Trust seems unsure how it would enforce compliance in these matters (“We may ask organisations to show that they’re complying with this as part of our organisation audits.”).
Even if all Plan S signatories strictly enforce the principles of DORA, there’s still a whole world outside of Plan S that may not follow those same rules. For example, China’s research career structure is largely based around the Impact Factor (a key reason why it is unlikely that China will sign on to Plan S). If you’re a Chinese postdoc hoping to eventually run your own lab in China, would you choose to go train in a Plan S-funded lab that will prevent you from publishing that Nature paper that may be essential to your career plans? If you’re one of the majority of PhD recipients who will be seeking a career outside of academia, will you willingly sacrifice the chance to publish that Science paper that will help you stand out from other applicants for an industry job?
Placing strict limitations on PIs may shift the essential relationship between students and their mentors, away from a nurturing environment in which to build a career and toward more of an employer/employee arrangement, where the student is a hired hand brought in to serve the PI’s funding needs. This is why public responses to Plan S from researchers raise concerns about isolation, difficulty recruiting top laboratory members, and the loss of potential international collaborations.
The success or failure of Plan S is largely going to come down to how researchers themselves react to it. Previous OA policies have not had the same level of impact on researcher publication practices as Plan S will, and funded researchers are going to be asked to make sacrifices in order to support its goals. While much focus has been put on publishers’ responses, in the end, publishing is a service industry and will adapt to meet customer needs. The researchers themselves will determine what those needs are, and whether Plan S is widely supported or its negatives outweigh its positives. Researcher response will determine the further spread of Plan S beyond its initial supporting organizations.
Just as the financial implications of a policy must be recognized and planned for, so must the social changes that may result. Creating conflicting pressure between the head of a laboratory and its members is certainly an unintended consequence of Plan S, one that could negatively impact resulting research. How these concerns are recognized and addressed will be a key factor in the potential success of Plan S.