Author’s Note: This post is based on a talk I gave this week at NISO Plus 2021.

Twenty-five years or so after journals first went online, we’re just on the cusp of realizing what that really means for reporting research results. Our first efforts, really our first decades, were spent recreating the analog print experience for journal readers – monthly issues filled with PDFs of laid-out, space-limited articles. But a lot happens over the course of a research project, and while the resulting paper provides a really useful summary of that project, a lot gets left behind and never sees the light of day.

What we’re realizing as a community is that we’re leaving an enormous amount of value on the table, and that if we can do a better job of capturing, preserving, and making available more of the research workflow, we’ll drive better transparency and reliability in research conclusions, and improve both the efficiency of and the return on the investment we make in research funding.

On the surface, this seems like an obvious idea: create a detailed public record of everything that happens, every day, throughout a research project. In the real world, though, this runs into practical limitations. Storage space, discovery, and infrastructure issues aside, it is a huge ask and a huge time sink. In nearly every talk I’ve given over the last 15 years or so, I’ve used some variant of the phrase, “time is a researcher’s most precious commodity”, and it continues to ring true.

Time spent on record-keeping, documentation, and publication of those records is time not spent doing experiments. Any change made to reporting requirements is going to take the researcher away from the bench or bedside. Doing things right requires a combination of strategies to reduce the burden on the researcher and to offer rewards significant enough to justify the burden that remains. Further, there’s a lot a researcher does that goes nowhere, or that would contribute little public value to research reliability and reuse. How do we separate the wheat from the chaff?


Choosing the right starting points

Rather than attempting an immediate, and probably impossible, sweeping cultural shift to radical transparency, it’s better to start with the parts of the research workflow that offer the most obvious value and impose the least burden on researchers. The end goal is, of course, completely open research, but it’s likely going to be a long road to get there, so we need to choose steps that provide value quickly in order to build momentum.

Open data is an obvious first step, and we’re increasingly far along the way to making it a standard part of any research project. What’s really helpful about the open data movement is that it has created a model for how we can open up other parts of the research workflow – offering standards and best practices that can be applied and adapted elsewhere.

Never mind the data, where are the protocols?

But data alone is not enough, and an enormous hole in the open science movement has been the lagging attention paid to the reporting of research methodologies. Being able to review the data behind a study does indeed allow one to see if a researcher’s analysis and the conclusions drawn are accurate for that dataset. But it does little to validate the quality and accuracy of the dataset itself. If I don’t know how you got that data, I have no idea if it’s any good, and I certainly don’t stand any chance of replicating it.

A big problem here is that the scant information offered in most journals’ Materials and Methods sections is insufficient to give anyone a chance of repeating what the original authors did. Often, when describing a technique, an author will merely cite a previous paper where they used that technique…which itself cites a previous paper, which cites yet another, and the wild-goose chase is on. This lack of detailed methodology reporting is something of an anachronism, driven by decades of a print-dominant publication model aimed at reducing the number of pages in journal issues, along with a lack of incentives to improve methods reporting.

Just as open data requires the public availability of the data behind any published research conclusions, so open methods would require the public availability of detailed documentation of the procedures used to gather and analyze those data. Like open data, this can happen through a variety of routes — publication of the method as a standalone paper cited by the research paper, detailed documentation of the methods in the paper itself (or its supplementary materials), or citation of a version of the method deposited in a repository such as protocols.io.
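To make the deposit-and-cite route a little more concrete, here is a minimal sketch (in Python, purely for illustration) of what a machine-readable record for a deposited method might look like, and how a paper could point at it. The field names and the DOI are my own hypothetical assumptions, not the actual protocols.io schema; the point is simply that a stable identifier and a version travel with the citation.

```python
import json

# A minimal, hypothetical metadata record for a deposited method.
# Field names are illustrative assumptions, not any repository's real schema.
method_record = {
    "title": "Example dissociation and culture protocol",
    "doi": "10.17504/protocols.io.EXAMPLE",  # placeholder, not a real DOI
    "version": "1.2",
    "license": "CC-BY-4.0",
    "authors": ["A. Researcher", "B. Collaborator"],
}

# The research paper then cites the deposited method by DOI and version,
# so readers land on the exact procedure that generated the data.
citation = (f"Methods as described in doi:{method_record['doi']} "
            f"(version {method_record['version']})")

print(json.dumps(method_record, indent=2))
print(citation)
```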

Enormous potential for reuse

Just as important as transparency is the increased efficiency offered by open science. One of the big drivers of open data is the potential for reuse of that data. The same is true for open methods, perhaps even more so. An enormous amount of the data generated by research projects is highly specific – looking at one particular cell type or geographical region or behavior under very particular conditions. It’s neither obvious nor easy to repurpose those kinds of data. But methodologies are far more adaptable to new research projects, even ones that aren’t directly related. During our NISO Plus session, Emma Ganley from protocols.io offered an example of a method developed in a study of fish parasites being reused by researchers working on neuron cell cultures.

A huge part of any research project is spent figuring out how to do what you want to do, then learning and perfecting the techniques you’re going to use. Having a vetted, proven methodology available can offer an enormous head start.

Lest you doubt the power of new techniques, go back and look at the last 10-15 years of Nobel Prizes in Physics, Chemistry, and Medicine – RNAi, CRISPR, green fluorescent protein, super-resolved fluorescence microscopy, methods for introducing gene-specific modifications in mice, optical tweezers, cryo-electron microscopy, experimental methods that enable measuring and manipulating individual quantum systems. A huge percentage have gone to those who created the approaches that others are now applying to research questions.

Publication Metrics

Ask any journal editor and they’ll tell you that methods articles are nearly always among the most cited articles. This speaks to their value in driving future research, as well as their often broad applicability across different projects. A personal anecdote: I went back and looked at the 15 or so scholarly papers I’ve published, and the most cited is not any of the actual research I did, which was interesting but largely incremental, but rather the how-to article for a method I spent a couple of years working out. It came out in 2002 and was still being cited by new papers within the last year. It has also been cited in several patent applications, and if you’re a funder looking to drive economic development through research funding, that’s exactly the kind of result you want to see.

Making it happen

Making open methods happen is going to take input and effort from a wide variety of stakeholders. We need clear standards, ideally modeled upon the FAIR Principles already in place for open data. Those standards need to apply on many levels, from the content of the methodologies themselves to how they are tagged, cited and made public and discoverable. We need standards and qualifications for repositories and other storage services to ensure reliable, perpetual access. Protocols tend to evolve over time as different researchers find different uses for them, so the branching or forking of methods needs to be considered.
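As a rough sketch of what “considering forking” could mean in practice, the toy Python model below records each deposited method version with a pointer to the version it was derived from, so a repository could reconstruct a method’s full lineage. The structure, field names, and identifiers are assumptions for illustration, not an existing standard.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MethodVersion:
    """One deposited version of a method; identifiers here are hypothetical."""
    identifier: str               # e.g., a DOI or repository accession
    derived_from: Optional[str]   # identifier this version was forked from

def lineage(identifier, registry):
    """Walk derived_from pointers back to the original deposit."""
    chain = []
    current = identifier
    while current is not None:
        chain.append(current)
        current = registry[current].derived_from
    return chain

# Toy registry: a fish-parasite protocol forked for neuron cell culture,
# echoing the reuse example from the NISO Plus session.
registry = {
    "doi:10.9999/fish.v1": MethodVersion("doi:10.9999/fish.v1", None),
    "doi:10.9999/fish.v2": MethodVersion("doi:10.9999/fish.v2", "doi:10.9999/fish.v1"),
    "doi:10.9999/neuron.v1": MethodVersion("doi:10.9999/neuron.v1", "doi:10.9999/fish.v2"),
}

print(lineage("doi:10.9999/neuron.v1", registry))
# ['doi:10.9999/neuron.v1', 'doi:10.9999/fish.v2', 'doi:10.9999/fish.v1']
```

A real standard would need much more than this (provenance, attribution across forks, deprecation), but even a simple parent pointer is enough to make reuse visible and creditable.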

As open data requirements continue to be adopted, funders must also recognize the importance and utility on offer, and see methodologies as a valued research output, worthy of appropriate career reward and continued funding. As is happening for open data, researchers will need to shift their workflows and record-keeping behaviors with eventual public access and utility in mind.

And publishers have a real opportunity to lead here. Just as those brave enough to drive the first open data requirements for publication were able to speed progress and normalize the idea for authors, so too can similar efforts around better methods reporting. Many publishers are already ahead of the curve here, with efforts like Cell Press’ STAR Methods, Nature’s Protocol Exchange, and PLOS’ recent announcement of Lab and Study Protocols (please do chime in below in the comments to discuss your journal’s efforts as well!).

Now is the time to move this forward. Put simply, transparency around research methodologies is essential for driving public trust and accurate, reproducible research results.

David Crotty

David Crotty is a Senior Consultant at Clarke & Esposito, a boutique management consulting firm focused on strategic issues related to professional and academic publishing and information services. Previously, David was the Editorial Director, Journals Policy for Oxford University Press. He oversaw journal policy across OUP’s journals program, drove technological innovation, and served as an information officer. David acquired and managed a suite of research society-owned journals with OUP, and before that was the Executive Editor for Cold Spring Harbor Laboratory Press, where he created and edited new science books and journals, along with serving as a journal Editor-in-Chief. He has served on the Board of Directors for the STM Association, the Society for Scholarly Publishing and CHOR, Inc., as well as The AAP-PSP Executive Council. David received his PhD in Genetics from Columbia University and did developmental neuroscience research at Caltech before moving from the bench to publishing.

Discussion

10 Thoughts on "What’s Next for Open Science — Making the Case for Open Methods"

Completely agree, David… the next thing we need to tackle IMO is a better interconnected research article. We have been talking about it for a while, but the inclusion of different research artifacts that substantiate the scientific process will provide huge benefits to the community. I have some theories as to why this isn’t already happening but would be curious to learn what others think.

Great piece! Cell Press has expanded STAR Methods into STAR Protocols, an open access, peer-reviewed protocols journal to help scientists share their protocols, particularly as a complement to their full-length research articles. We have been building this resource in partnership with researchers, guided by their feedback. I would be happy to chat with any of you further about our progress. https://www.cell.com/star-protocols/home
https://star-protocols.cell.com/

This is great stuff, David. As I see it, open outputs are looking good, but open processes not so much. My impression from talking to a leading funder is that they have not really got this, and my impression from interviewing ECRs (ciber-research.com/harbingers-2) is that they do not know much about open science. You have a lot of influence, for example with STM, which has recently done some great work, so do please follow up on “Put simply, transparency around research methodologies is essential for driving public trust and accurate, reproducible research results.” I hope SK will let me enclose my email in the interest of openness. Anthony

Excellent blog David, and a great NISO Plus session yesterday as well. It looks like there will be enough people interested to form a more formal NISO working group out of that session, which we are happy to be involved with. On your point about other publisher examples, JMIR Publications have been publishing JMIR Research Protocols since 2009, I know CSH Protocols started in 2006 (when you were there? I was on campus with a Chinese delegation), and the Center for Open Science is doing interesting work on Registered Reports. As well as ensuring there’s two-way linking between the protocols and the final published article, there are also challenges in how some of the indexing services like WoS and Scopus index and classify protocol-only publications. The metrics and impact angle you noted is another important one to dig into as well.

Thank you for highlighting the transformative potential of open access methods! We at Cell Press agree and have just launched Cell Reports Methods, a new open access methods journal, to bring these resources to the community. Our papers use the STAR Methods framework, and we are partnering with STAR Protocols to complement the methods we publish with a user-friendly step-by-step protocol, bridging the gap between the principles behind the method and the procedure. We are multidisciplinary and cover a wide range of biological and technological advances. You can find us here for more information: https://www.cell.com/cell-reports-methods/home.

As a researcher I successfully use inter-disciplinarity and multi-disciplinarity among various theoretical approaches: for example, after a geo-historical investigation, archaeology allowed me to demonstrate that time is a concept instead of a physical phenomenon.

Let’s not forget the ingredients in your scientific recipe!!!

Lovely piece, thank you. Protocols.io and the methods/protocols journals will go a long way toward remedying the lack of love the Methods section has received over the last few decades. Thank you also for the shout-out to STAR Methods; it was a really momentous effort on the part of Cell Press.

I wanted to point out that the reagents that cause the most problems, e.g., antibodies and cell lines, need to be part of any solution to improving the transparency of methods. We know that, overall, less than 50% of antibodies are “identifiable”, but in places like STAR Methods, which enforce RRIDs, that number jumps to over 90%. We ranked the top journals for antibody identifiability and found the following journals are most “awesome”… perhaps not so incidentally, they all enforce RRIDs.

https://www.sciencedirect.com/science/article/pii/S2589004220308907#tbl2

Yes, absolutely! I couldn’t fit the whole talk in one blog post, but we spent a good deal of the panel discussion talking about RRIDs and machine readouts as essential parts of the solution.
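To give a flavor of what those machine readouts might look like, here’s a rough Python sketch of a check a production pipeline could run to flag methods text that mentions key reagents without an accompanying RRID. The regex covers only a few common RRID prefixes (antibodies, Cellosaurus cell lines, SciCrunch tool registrations) and is a simplification of the real RRID grammar; the flagging logic is purely illustrative.

```python
import re

# A few common RRID patterns -- a simplification, not the full RRID grammar.
RRID_PATTERN = re.compile(r"RRID:\s?(AB_\d+|CVCL_[A-Z0-9]{4}|SCR_\d+)")

# Reagent keywords that should normally carry an identifier.
REAGENT_KEYWORDS = ("antibody", "cell line", "antiserum")

def check_methods_text(text):
    """Return the RRIDs found and flag text where reagent mentions outnumber them."""
    rrids = RRID_PATTERN.findall(text)
    mentions = sum(text.lower().count(k) for k in REAGENT_KEYWORDS)
    return {"rrids_found": rrids,
            "reagent_mentions": mentions,
            "flag_for_review": mentions > len(rrids)}

example = ("Cells were stained with rabbit anti-GFP antibody (RRID:AB_221569) "
           "and a second, unidentified antibody.")
print(check_methods_text(example))
# {'rrids_found': ['AB_221569'], 'reagent_mentions': 2, 'flag_for_review': True}
```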

What happens when the methodologies and protocols used are based on a completely different kind of physics? I have closely followed the work being done at Brilliant Light and Power. They use the predictions of the Grand Unified Theory of Classical Physics (Herman Haus, MIT; John J. Farrell, Franklin & Marshall College; Randell L. Mills, Franklin & Marshall College, Harvard, MIT) to guide the development of various devices, processes, and software. One process extracts energy from an electron of the hydrogen atom when the electron is induced into falling to an orbital below the ground state. The development has culminated in a commercial-ready device, the Suncell. Their Hydrino reaction, as used in that device, has been validated by a well-recognized expert in the field of heat calorimetry, Wilfred R. Hagan, and by the developer of the theory, Randell L. Mills. That validation is the subject of a paper which is in preprint:

https://assets.researchsquare.com/files/rs-144403/v1_stamped.pdf

on its way toward publication in Nature.

What place would a theory such as that, in so-called fringe physics, have in your way of treating protocols and methodologies? The problem this theory and this company have is that too many view the work as potentially fraudulent, simply because the theory used is not recognized by academic physics when compared to standard quantum mechanics.

I’m not sure that the controversial nature of a given piece of work makes a difference in this context. Either you have a detailed and consistent methodology or you don’t. If you’re publishing a research result, then you should supply that methodology so that others can replicate the experiment or use the method elsewhere. If anything, having detailed methodological descriptions would be ideal in a situation where the result is controversial (e.g., cold fusion) as it would let people understand where misleading actions might have been taken as well as rapidly see if the phenomenon described occurs in a consistent manner in the hands of anyone other than those making the controversial claims.
