In early August, it was announced that UK Research and Innovation (UKRI) would provide significant funding for a new open publishing platform. Called Octopus, this initiative is not yet fully launched, but when it is it plans to “provide a new ‘primary research record’ for recording and appraising research ‘as it happens’”; UKRI calls Octopus “a ground-breaking global service which could positively disrupt research culture for the better.” I reached out to Octopus’s founder, Dr. Alexandra Freeman, to ask some questions about Octopus and its plans for the future.
What led you to found Octopus – what problem (or problems) does it exist to solve?
When I came back to academia at the end of 2016, after 18 years working in the media, I was really surprised to experience the scientific publishing system ‘from the inside’. Researchers openly talk about ‘what story they are going to tell’ with their data, publication is very slow, word limits mean that full details of work have to be put into supplementary information, hosted elsewhere or simply left out, and a lot of time is spent on tiny details (even down to fonts) of no relevance to the work itself. On top of that, publications seem to be almost the only way in which researchers’ work is being assessed — and often through very poor proxy metrics. Researchers are being put under enormous pressure to get work through the current publishing system, and yet their outputs are then evaluated by the sorts of outcomes that I was used to in the media (how many readers, how many citations, how ‘impactful,’ etc.) rather than measures of scientific quality.
It struck me that journals are trying to fulfill two roles at once in the current system. On the one hand they are disseminating important findings to their readers, particularly to professionals who may be able to implement those findings in practice. On the other hand they have the somewhat thankless task of being the primary research record, where researchers publish all their work in all detail to a minority readership. These two roles pull in opposite directions: an article that gives full details, all the twists and turns and blind alleys of the real research process, does not make a nice, easily-digested narrative for those who mainly want to know the findings and implications; an article that supplies the ‘edited highlights’ does not give the full details that other researchers need to learn from. Journal articles are best suited to the dissemination role — the “news and views” and editorialized, narrative-driven pieces. What I felt was missing from the landscape was a platform designed exclusively to be the primary research record — to serve the need for publication of information fast, and in full detail, and open to full scrutiny.
Octopus, then, sets out to be a sort of global “patent office” for science. It’s where researchers can publish their work in full detail, and instantly, claiming priority and ensuring that it is described in such a way that another researcher knows exactly what they meant and what they did. It won’t be a good read, but it will allow researchers to display the quality of their science and not their storytelling. And, importantly, it allows us to set the incentive structure on this platform to match what we want to encourage – what is useful for science.
What do you see as the most important differences between your publishing model and a more traditional journal platform?
There are some features of Octopus that are shared with other platforms, but in combination I think they are unique. I’d say some key ones are:
- Instant publication (once all authors agree), ensuring that work is shared quickly and efficiently, with no gatekeepers to the research record.
- Open post-publication peer review, with the option for authors to re-version their work at any time (older versions remain available).
- Breaking down the traditional ‘paper’ format into smaller publications, allowing review at every stage, smaller author groups, and faster sharing, and removing the need for a narrative, which relieves the pressure that drives questionable research practices.
- Treating peer reviews like any other publication, which should incentivize both reviewing itself and the skills that good reviewing requires. Treating constructive critique as a valued scientific skill should help foster a more collaborative approach.
- A rating system based on predefined criteria, set for each different publication type (including reviews). This allows us, as the scientific community, to define what “good” means to us in each case. Of course any metric is a poor relation to “reading and making one’s own mind up,” but we do need metrics, and in Octopus these will be designed to be as close a proxy for quality as we can make them.
- Minimizing cues that can cause unconscious bias (such as using initials rather than first names for authors on publications, no uploading of profile pictures, etc.).
- Every publication needs to be linked to an existing publication, to form branching chains. This is, I think, unique. It means that you cannot publish Data without linking that data to a published Protocol/Method; you can’t publish a Method without linking it to a published Hypothesis/Rationale etc. Only Reviews and Problems can be linked to any other publication type. This is designed not only to encourage good practice in terms of ensuring that each part of the research process is explicitly defined and shared, but also to help discoverability: imagine being able to find every publication associated with a particular research question, instantly.
- Publications can also be “horizontally linked” — a publication that is in one research chain, associated with one research question, might be equally useful in another (e.g., an algorithm that was devised in the context of liquid flow in physics may be equally useful in explaining swarming behavior in biology or crowd behavior in psychology, etc.). Any researcher can add a horizontal link like this, and the links can be rated by others, and the researchers gain credit for the links they have made. This can encourage cross-disciplinary thinking and again increase the collaborative approach.
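The linking rule above can be made concrete with a small sketch. This is hypothetical Python, not Octopus’s actual code: `ALLOWED_PARENTS` and `can_link` are names invented for illustration, and only the publication types mentioned in this answer are included.

```python
# A hypothetical sketch of the publication-linking rule, not Octopus's data
# model. Type names follow the interview; the real platform defines further
# publication types along the research chain.

# Which parent types each publication type may attach to.
# None means "may link to any existing publication" (Problems and Reviews).
ALLOWED_PARENTS = {
    "Problem": None,
    "Hypothesis": {"Problem"},   # Hypothesis/Rationale attaches to a Problem
    "Method": {"Hypothesis"},    # Protocol/Method attaches to a Hypothesis
    "Data": {"Method"},          # Data attaches to a Protocol/Method
    "Review": None,
}

def can_link(child_type: str, parent_type: str) -> bool:
    """Return True if a new publication of child_type may be linked
    beneath an existing publication of parent_type."""
    allowed = ALLOWED_PARENTS[child_type]
    return allowed is None or parent_type in allowed
```

Under this rule, publishing Data beneath a Problem would be rejected, while a Review can attach anywhere in a chain.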
And one feature that I don’t think will be ready for first launch, but which I’m passionate about, is making the whole platform language-agnostic. I want to reduce barriers to taking part in scientific research, and one of those that we can now just about remove is language. The best automatic translation systems are now good enough to allow people to choose the language in which they read and write on the platform — others will be able to read and write in other languages, seamlessly. This should be just about feasible now, and is only going to improve over the coming years, so I want to get that feature working as soon as possible.
Information about copyright on the FAQ page is a bit confusing. Does Octopus require contributors to use a CC license, and if so, does it require a specific one?
Contributors can choose which license they want to apply to their work. I had imagined that the Creative Commons licenses would offer enough scope amongst them for all the possible restrictions authors might want to add to their work. If Octopus needs to offer other license types then I’m happy for that to happen — it should be the author’s choice what license they apply to their work.
Also on the FAQ page, the answer to the question “Can I publish in both Octopus and a journal?” is a bit less than clear. To clarify: will Octopus allow an author to publish, say, an article both there and in a traditional journal?
Of course people can publish elsewhere as well. I suspect in the short term Octopus will act a bit like a preprint server in that respect. Authors will hopefully want to publish quickly in Octopus to ensure they get the credit for their work as soon as possible, protecting their “priority” over it, but then write it up for a traditional journal at some point. In the longer term I think that second stage will modify, so that journals will not be taking primary research papers and organizing peer review, but instead be commissioning reviews, syntheses, editorials, and short “news” articles announcing findings — so authors will instead be pitching those once they have published in Octopus.
Some people are concerned that there are more and more platforms appearing and researchers are being made to jump through more and more hoops. I feel that pain! Octopus is not trying to reinvent everything and add to that work — it’s trying to help bring all these places together and to make things easier and faster for researchers. The individual author pages in Octopus will be automatically generated from information collated by ORCiD and through Octopus itself — you will not be able to add to them yourself (this is also partly to minimize those cues allowing unconscious bias, and to avoid cherry-picking of outputs). Octopus will not create restrictions on the format of publications — you decide how you want your research to look (no more font or spacing or heading requirements). There will be some metadata entry but this will be minimized (no more ‘enter it here exactly as it is in your manuscript’!). If you use existing platforms, Octopus will try to integrate and work with those so that you don’t have to repeat your work. We’re trying to make researchers’ lives easier: to find the work of others, to assess it, to build on it, and to share your own work.
Is Octopus equipped to host large data sets?
No, Octopus isn’t a data repository. There are many specialist data repositories for different fields, and authors are asked to give a DOI or other unique identifier for data that is deposited elsewhere. The same is true of specialist repositories for protocols, code, or videos—you just link out to them from Octopus.
UKRI has committed £650,000 over three years to support Octopus. What more will be needed to make Octopus sustainable in the long term?
Octopus’s aim is to minimize its costs as much as possible. UKRI has given us exactly what we asked for in terms of finances — enough to build from the current prototype to a launch platform. In the long term, I want to keep Octopus’s running costs as low as is feasible so that it can be maintained through crowdfunding, philanthropy, or by a traditional funder. Failing that, the annual running costs would be minimal if spread across academic institutions — probably the cost of a single journal title each.
How do you see the work of Octopus dovetailing with that of the Open Science Framework (OSF)?
I personally use OSF, and other platforms, at the moment. Where data, protocols, analytical code, or pre-registrations are hosted on these sorts of platforms, authors will simply link out to them from Octopus. We will try to integrate as much as possible, and allow others to integrate with Octopus.
The website indicates that “anyone logged into Octopus can rate publications, review them or red flag them if they have serious concerns.” Does Octopus take steps to flag or screen out incompetent or even malicious reviewers, or is that function left to the community at large?
Octopus will not be doing any moderation of the platform. Everything that a registered user does will be visible from their personal profile page — the page that institutions and funders will look at. If someone’s publications (which include their reviews) are poorly rated or red flagged by others, this will be visible on that individual profile page. An individual’s ratings of others will also be visible — allowing persistently overly-generous or overly-negative raters to be easily identified as well. Full transparency of this kind of information will, I hope, make it relatively easy for poor behavior to be spotted.
Since it’s open source, Octopus will allow developers to create all sorts of plug-in modules to visualize the metadata created within the platform, looking at researcher behavior, etc.
The FAQ document explains that when faced with serious objections or concerns with their work, an author can “retract” by re-versioning their submission. But in the event of a dispute that isn’t resolved amicably, the author and the complainant are referred “to the authors’ Institutional Integrity Office, or their national office.” Does Octopus have dedicated staff for the purpose of managing and referring these disputes?
No, this will all be automated — and you’ve spotted one of the problems we’ll be working on over the coming months. With ‘red flags,’ when a reader raises one, the first thing that happens is that a flag will appear on the publication and other readers will be able to read the concerns expressed. These concerns will also be emailed to the authors, who will be able to respond. If the initial flagger is satisfied with a response (which could be an explanation or could involve a reversioning of the publication), they can remove the flag. If the dispute is not settled after a certain length of time, or the flagger decides to escalate it, the correspondence will automatically be forwarded to the relevant research integrity office. We are building a database of these to allow this automatic escalation, and of course dealing with it internationally is a bit of an issue! Of course if the authors have left academia, are dead, or otherwise uncontactable, then that red flag may remain indefinitely on a publication — but it is only a warning, and other readers can read the concerns and make up their own mind.
As I say, this feature is one of the things we’ll be working on, alongside research integrity offices around the world.
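The escalation workflow described above can be sketched as a tiny state machine. This is a hypothetical illustration (the class, method, and state names are invented for this sketch), not Octopus’s implementation.

```python
# A hypothetical sketch of the red-flag workflow described in the interview,
# written as a minimal state machine. Names and states are invented; only the
# transitions paraphrase what the interview describes.

class RedFlag:
    def __init__(self, flagger: str, concern: str):
        self.flagger = flagger
        self.concern = concern        # visible to all readers alongside the flag
        self.state = "RAISED"         # flag appears on the publication immediately

    def author_responds(self, response: str):
        # Authors are emailed the concern and may reply with an explanation,
        # or re-version the publication.
        if self.state == "RAISED":
            self.last_response = response
            self.state = "AWAITING_FLAGGER"

    def flagger_satisfied(self):
        # Only the original flagger can remove the flag.
        if self.state == "AWAITING_FLAGGER":
            self.state = "RESOLVED"

    def escalate(self):
        # After a timeout, or at the flagger's request, the correspondence is
        # forwarded automatically to the relevant research integrity office.
        if self.state in ("RAISED", "AWAITING_FLAGGER"):
            self.state = "ESCALATED"
```

If the authors never respond (or cannot), the flag simply stays in place as a warning, as the interview notes, and readers can weigh the recorded concerns themselves.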
Right now there is only a handful of items in Octopus, virtually all of them contributed by you. It’s early days of course, but do you have specific goals for ramping up content acquisition, or will you just let the content come organically and see what happens?
Octopus doesn’t exist at the moment — what you’re looking at (science-octopus.org) is a prototype working platform that helps demonstrate the functionality and which we use in workshops and user-testing to get feedback from users. We plan to launch in spring 2022.
What we’re working on at the moment, prior to launch, is “seeding” the Octopus database with existing content. Because every new publication needs to be linked to an existing one, we need to have a good framework of existing publications for researchers to attach their new work to. We’re taking Jisc’s CORE database of Open Access publications and using natural language processing to extract ‘Problems’ (research questions), and hierarchically linking those to each other. This is an interesting project in itself, but it means that when we launch we will hopefully have a good number of existing Problems, and those will link out to existing old-style papers in CORE. Researchers won’t be able to add new “papers” in Octopus, but they will be able to add reviews and ratings in Octopus of those original “seed” papers. That means that from day one of Octopus it will hopefully already be a very valuable place to search and find publications, and read the ratings and reviews of others. From then on, I hope it will snowball — people will probably initially want to publish work that they wouldn’t have other outlets for (hypotheses without data, small data sets, negative results, etc.) as well as reviews, but also I think that they will want to ensure that they establish priority on work. I can foresee a bit of a “gold rush” of that!
By making Octopus easy to use for researchers, and useful for institutions and funders looking for metrics or well-reviewed protocols to fund, I hope that there will be a virtuous circle. People always say culture change is slow and hard, but I think by making something that makes everyone’s lives easier and having an eye to the incentives, it can actually happen very quickly indeed.