
In 2004, Wired published an article entitled “Better Science through Gaming,” in which Kristen Philipkoski argued that the process of working with complex data sets could be improved via the adaptation of game-like technologies — and that “software for the life sciences has lagged the consumer market by 20 years.”

The focus of the piece was GeneSifter, genome analysis software developed by a video game programmer with a background in the life sciences (updated in March 2010):

Based on the original program written by Olson, scientific director at VizX, and Jeff Kozlowski, head programmer at VizX, GeneSifter uses XML to aggregate information from public and private genome databases like the National Institutes of Health’s GenBank, Ensembl in the United Kingdom, and GeneCards in Israel.

It can shorten a project from months to hours. Some even say it’s almost as fun as a video game, at least compared with the alternatives.
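
To make the XML-aggregation idea concrete, here is a minimal sketch (in Python, and emphatically not GeneSifter’s actual code) of pulling a single GenBank record as XML from NCBI’s public E-utilities service and extracting a few descriptive fields. The accession number and printed fields are purely illustrative; a real aggregator in GeneSifter’s vein would merge results from multiple sources such as Ensembl and GeneCards as well.

    # Illustrative sketch only (not GeneSifter's code): fetch one GenBank record
    # as XML from NCBI's public E-utilities service and print a few fields.
    import urllib.request
    import xml.etree.ElementTree as ET

    EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"
    accession = "NM_000546"  # example accession; swap in the record you need

    url = f"{EFETCH}?db=nucleotide&id={accession}&rettype=gb&retmode=xml"
    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())

    # GenBank XML wraps each record in a GBSeq element
    for record in root.iter("GBSeq"):
        locus = record.findtext("GBSeq_locus", default="?")
        definition = record.findtext("GBSeq_definition", default="?")
        organism = record.findtext("GBSeq_organism", default="?")
        print(f"{locus}: {definition} [{organism}]")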

Other examples of serious games for science are being pioneered by the Federation of American Scientists as part of their Learning Technologies Projects Group:

In order to help advance our research, as well as promote our vision of what we feel should be the future of learning, FAS has opted for more than a mere academic involvement in the creation of a new model for learning. We are actively involved in the creation of games and simulations that we feel represent some of the best ideas for such models.

(Links to FAS games, simulations, and related projects, which include Medulla, Immune Attack, and The Digital Human Projects, can be found here.)

Further inspiration can be drawn from educational communities that focus on experiential, multi-sensory learning. The Lab School of Washington is a leading school in Washington DC for bright, motivated students (grades 1-12) with moderate-to-severe learning disabilities.

“Lab” incorporates gestural content interactions into its curriculum and summer programs, cultivating skilled debaters, strategists, and communicators without relying on text-focused skills. Students at Lab learn quantitative analysis and perform risk-reward assessments via simulations. They develop refined reasoning capabilities by encountering dead ends and taking calculated risks to surmount problems, repeatedly, until they become experts. Their particular requirements and mission position Lab, and other schools like it, as incubators for interactive learning.

Even for mainstream students, gaming is a ubiquitous, informal learning vehicle. According to a January piece in the New York Times, “If Your Kids Are Awake, They’re Probably Online,” people ages 8-18 spend an average of one hour and 13 minutes per day gaming, compared with 38 minutes per day using print.

Dr. Michael Rich, a pediatrician at Children’s Hospital Boston who directs the Center on Media and Child Health, said that with media use so ubiquitous, it was time to stop arguing over whether it was good or bad and accept it as part of children’s environment, “like the air they breathe, the water they drink and the food they eat”.

Over the next 15 years, these users, who experience content rather than strictly read it, will make up the community of scientists, researchers, and society members who are our customers. It may be difficult for traditionalists to make the conceptual leap from journal or book publishing to scientific simulations and instructional gaming. However, as economics and culture align, these will become part of the fabric of the industry.

Not everyone will thrive in a transformed business landscape. For centuries, scientific publishers have been scribes and disseminators of content who have translated the activity of science into a linear, replicable, two-dimensional experience. Sometimes even the most accomplished companies can’t transition outside their core specialties. (Apple, for example, is an exemplary device manufacturer and marketing company that has been comparatively ineffective in the software space. Microsoft, conversely, has excelled in software but failed to make headway in devices.)

Is it better, then, for publishers to focus on the curation and filtering of content, leaving user services development to others? Or should they be cultivating new skills that prepare them for a different future?

Related topics were discussed at the 2010 Society for Scholarly Publishing Annual Meeting, and experts recommended that publishers explore the edges of the envelope. Paraphrasing:

  • The future of scholarly publishing requires publishers to curate and filter specialized information in ways that increasingly involve providing knowledge services rather than strictly delivering information.
  • Reinvention is not accomplished by pouring content into other containers that are, upon closer inspection, glossier versions of their predecessors that pay homage to the same conventions.

If experimental, evolutionary strategies are what will help publishers compete with start-ups and deep-pocketed technology companies (which lack publishers’ subject-matter expertise and commitment to the scholarly and scientific mission), what supports exist for those wishing to invest in more radical innovation?

I have a utopian vision.

Let’s consider driving our own change by creating a self-disciplined, self-reinforcing “flywheel” incubator, akin in its aims to Google’s famed 80/20 innovation policy, responsible for exploring and pioneering next-generation, extensible, specialty-content-based technology services on behalf of a scholarly publishing collective. The collective would offer participants economies of scale and knowledge transfer. This could be a community start-up for the technology fringe, channeling the ethos that created BioOne and the model behind the FAS’s LTP.

As Jim Collins observes in “How the Mighty Fall: And Why Some Companies Never Give In”:

[D]ecline, it turns out, is largely self-inflicted, and the path to recovery lies largely within our own hands. We are not imprisoned by our circumstances, our history, or even our staggering defeats along the way. As long as we never get entirely knocked out of the game, hope always remains.


Discussion

6 Thoughts on "Serious Games, Science Communication, and One Utopian Vision"

Hi Alix,
Great post. I enjoyed similar discussions at SSP. We can look to the healthcare market for leadership in reimagining content delivery to fit workflows – yes, the BMJ (for example) still publishes a print journal, but at the other end of the spectrum, they also deliver their information as actionable content for integration into clinical systems.

The first step for publishers is to get much closer to our ‘consumers’. The bonds of trust have been broken and many readers and authors are suspicious of (the business of) publishing. As such, a lot of publishers may have a sketchy, outdated understanding of how their content is actually used. Some publishers are investing a lot in understanding how information is digested and used in research and learning. We all need to focus some time on understanding the core needs underlying current research / information habits, so that (in your utopian incubator) we can innovate to repurpose our content effectively for future needs.

Is this research on “how information is digested and used in research…” published anywhere, or is it all proprietary? Who is doing it?

Hi David, I am not aware of any that has been published yet, but I know that Elsevier, for example, are active (I think in relation to the Cell Press / Article of the Future). Perhaps we can persuade them to come and speak about it at next year’s SSP!
