Sea serpent from Scientific American (image via Wikipedia)

Regular readers of the Scholarly Kitchen may know that I tend to be critical of utopian visions for restructuring scientific communication. This is an information revolution and revolutions are always a sea of possibilities, most of which are not realistic. So to be fair, I here present a utopian vision of my own, to get my fair share of abuse, as it were.

It starts with the fact that science is organized around topics, which are basically aspects of nature that are amenable to productive study. The work, the funding, the education, and of course the journals, are all so organized. Thus science is loosely organized into communities of topical interest, bunches of people studying the same stuff in different ways.

The thing is that how they study this stuff, whatever it is, can vary a great deal within a given community. There are many methods. Most importantly, a method used in one community can be similar to the method people in distant communities are using to study their stuff. Moreover, these methods are being improved all the time, in one community or another. This creates a need to know about methods across distant communities.

But the communication system is built around topics, not methods, so methods diffuse slowly. A methods improvement in one community is likely to diffuse first to neighboring communities, then on to their neighbors, and so on. It can take decades to get to distant communities. Given the new information technologies, perhaps we can greatly improve this situation, speeding up the diffusion of methods in the process.

It will, however, involve some significant changes, which is what makes it utopian. But “methods” is a pretty vague term, so let me provide some simple categories and examples before getting into the possible changes.

Mathematics is perhaps the most widespread category. My working definition of science is “the mathematical explanation of nature based on observation.” Most science journal articles include math, including computer modeling. But math and models improve, so how do these improvements get diffused to all the different, and distant, scientific communities that can use them?

For example, recently an advance in the Monte Carlo method was published in a forest management journal. How long before this reaches the nuclear medicine community, which makes heavy use of Monte Carlo? For that matter, Google Scholar lists 1.8 million articles using the term Monte Carlo, about 800,000 of which were published in the last 10 years. This is a huge distributed community, overlapping most if not all of the topical communities.
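For readers outside those communities, a minimal sketch may help show what so widely shared a method looks like in practice. This toy estimate of π by random sampling is illustrative only; the function name and parameters are my own, not drawn from any of the articles mentioned:

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))
```

The same sample-and-count skeleton underlies Monte Carlo work from forestry yield models to nuclear medicine dosimetry, which is precisely why an improvement made in one field is usable in the others.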

There are technological methods, such as the use of lasers, specific computer algorithms or neutron scattering. There are also new materials, new processes, new forms of data analysis, and so on.

The point is that communicating these research methods between communities is potentially just as important as communicating the latest research results within communities. Moreover, given today’s revolutionary new capabilities, one can now ask scientists to be aware of these methodological breakthroughs.

The publishing community has several possible roles to play here, which may admittedly be as utopian as the vision itself. First of all, editors and reviewers can require the same demonstration of methodological understanding as they demand of topical understanding. Topical understanding is demonstrated in the first part of a journal article. It is where most of the citations come in. The second part, where one explains one’s methods, typically includes no such demonstration of the state of the method, but it now could and probably should. Adding a methods reviewer might be all it takes.

Beyond this there may well be a market for methods journals. As the Monte Carlo example shows, a de facto methods community may be vast. But such communities are widely distributed among the topical communities, and there are few formal mechanisms for scientific communication of methods. In fact, there is a Monte Carlo journal, but it is not geared toward the general practitioner. Or perhaps what are called for are reprint journals that filter the topical journals for the best methods material. I am not proposing a design here, merely pointing out the potential market.

As it stands, the industry seems to be focused on improving a system that already works well, the system of topical information flows. Perhaps it should be looking at the system of method information flows, as that one is barely served.


Discussion

12 Thoughts on "My Utopian Vision for Communication of Scientific Methods"

David – This is a very good idea and I think not at all utopian. Applying an ontologic layer of metadata (essentially tagging the XML to indicate the relevant methods in each article) would do the job. Fortunately this can be automated so that an editor (or heaven forbid, the author) need not do the tagging him/herself (which would be the line at which the whole thing becomes utopian – if computers can do it, it lies in the range of possibility). Once tagged, the metadata can be employed by global search engines, A&I systems, and specialized “hubs” that aggregate specific methods.
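As a purely illustrative sketch of what such tagging might look like, here is a hypothetical fragment in the style of a JATS keyword group; the "methods" group type and the terms shown are invented for illustration, not taken from any actual publisher schema:

```xml
<!-- Hypothetical method metadata in a JATS-style article header.
     The group type and terms here are invented for illustration. -->
<kwd-group kwd-group-type="methods">
  <kwd>Monte Carlo simulation</kwd>
  <kwd>variance reduction</kwd>
</kwd-group>
```

Once such elements exist in the XML, search engines and A&I services could filter on them directly, without any change to how the article itself is written.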

Auto-tagging sounds very useful, Michael. I will factor it into the vision. But auto-tagging will require a large ontology, or at least a taxonomy, of methods, will it not? That sounds like a major undertaking. On the other hand there may be a number of taxonomies around, especially for math.

David, yours appears to be a taxonomy of disciplines, not of methods. I have never seen a taxonomy of methods. However, OSTI has a monster taxonomy of science that includes methods at the third level. See http://www.osti.gov/taxonomy/

Actually it is more complex than a taxonomy. It is a taxonomy mounted onto a thesaurus. The thesaurus has 30,000 terms and 200,000 term-term connections. Each path in the taxonomy leads to a thesaurus term. It is not a pure taxonomy as there are many paths to any given term. I call the whole structure Word Web. It is in the public domain by the way. OSTI just uses it to help people pick search terms for their document databases. It is beautiful.

True, it was meant as a means of organizing methods under the fields where they were applicable (many methods fell into multiple categories) and to provide a browsing tool for the reader.

Ah, methods publishing, a subject near and dear to my heart, as I was part of a team that created a biology methods journal where I was editor in chief for several years. It is an interesting subject, and it remains somewhat difficult to get researchers to write up their methods in detail. As noted elsewhere, researchers are time-pressed, so getting them to carefully document protocols for others is a lower priority than doing and writing up new research results which will more likely lead to funding and career advancement.

Some thoughts:

Over the course of recent history, the methods used in research have become less and less a part of the formal research paper. The reasons stem mostly from a print mindset: journals were eager to include as many papers as possible per issue but were limited by the size of the print product (paper and printing costs, mailing costs, etc.). As such, one of the first things to go in order to save space has been the detail in the methods section. This is deeply problematic, as research results are not trustworthy if they are not replicable, and without an explanation of the methods used, they can’t be replicated.

To get around this, some journals have tried to get authors to write up a detailed methods section and place it in the article’s supplemental data. This has produced mixed results: it is additional effort for the author with little payoff, and, as most journal editors can tell you, very few readers ever delve into the supplemental data.

But, all is not lost, as we are slowly moving away from print and toward fully online journals. Online journals don’t have the same page limitations and should have plenty of room for detailed methods sections, if editors and reviewers insist on their inclusion.

Furthermore, there has been a trend toward creating methods journals, as this is seen as something of an untapped market (the age of the print laboratory manual is slowly coming to an end). These journals offer the author a place for a methods paper that is peer-reviewed, has an Impact Factor, and is PubMed indexed, all of the hallmarks necessary for a work to “count” toward funding and career advancement.

Hopefully, as search technologies improve, these methods will start to be easier to find. I’d like to see indexing services like PubMed create a special category for protocols and methods papers, much as they do for review papers. This would aid discoverability immensely.

David, you raise a number of different points. If I were doing market research for starting new journals focused on methods, then I would look for hot spots. That is, areas with a narrow focus where rapid progress was being made, creating a significant need to know. There are methods articles scattered throughout the topical literature, so it is possible to spot the emerging trends. Emerging methods are no different from emerging fields in that regard. The rapid rise of new language is often the tip off.

My target authors would be researchers whose only claim to fame lies in their new methods. I suspect there are many such. And again, they are getting published in topical journals, but they are potentially much more famous (and cited) in the highly distributed virtual community of potential users.

Marketing has its own challenges, but here too the new technologies may make marketing to a distributed audience possible.

Methods coverage in topical articles is another issue, but very important. My suggestion is to choose a relevant methods expert as one of the reviewers. This might do wonders, not merely for the coverage, but for the rise of new methods. This also applies to proposal reviewers, if not more so.

One problem, though, is that when labs that develop new technologies write about those new methods, they usually write conceptual pieces that explain the theory behind the method and provide a few examples of it working. It’s rare that they write out a step-by-step protocol, complete with reagent recipes (where appropriate) and troubleshooting advice. And that’s what’s really most valuable to the field and allows for easier adoption, rather than the broader review article.

Traditionally we’ve relied on lab manuals for these sorts of protocols, but fewer and fewer researchers are getting their methods from print books. More searchable online outlets for this type of material and career/funding incentives for making it available would help drive progress.

I don’t think the same problem exists in the social sciences, at least the ones I know best. An explanation of method is de rigueur for journal articles in the social sciences. But there are different levels of method, aren’t there? E.g., in the social sciences one typically distinguishes between investigations using qualitative methods and those using quantitative methods. But that only gets at a very generic distinction: there are plenty of variations within each of these two main types. There also, of course, are methods that try to combine the two generic types in fruitful, specific ways.
P.S. If there can be no science without observation, where does this leave superstring theory, which seems to have had no empirical evidence to support it yet?

Sandy, the problem (actually an opportunity) that I am describing is how people keep up with advances in the methods they use, when these advances occur in distant communities. I am pretty sure that there are widespread methods in the social sciences, so this problem should occur there too. Response coding methods, for example. Explaining one’s method in detail addresses this issue by facilitating global search, which is a starting point, but only that.

Your PS is quite relevant. Science has two legs, observational method and theory, and either may go first. New methods may lead to new mysteries, requiring new theories. Dark energy, for example, is a theory driven by new observations. Or new theories may require new methods of observation. The Large Hadron Collider was built to observe the theoretical Higgs boson, if it exists.

My point is that science is organized around the theory leg, so rapid communication of new methods is a challenge (also an opportunity for publishing).
