Shame on you, Neil deGrasse Tyson.
When Batman v Superman: Dawn of Justice — a film I avoided like the plague — showed up on a movie channel during a snowstorm recently, I watched a little bit of it. And guess who makes a cameo? Everyone’s favorite science educator and astrophysicist, Neil deGrasse Tyson, intoning the following as if on a real news program:
We’re talking about a being whose very existence challenges our sense of priority in the universe.
He invokes heliocentrism and Darwinian evolution in the mock interview, as if to ground it further in scientific verisimilitude.
The being he’s talking about is Superman.
Superman is soft science fiction treading the line into fantasy. He’s more Star Wars than Star Trek, to use a common geek distinction. Superman’s story isn’t about exploring scientific concepts and elaborating scientific possibilities. It’s a male fantasy.
Tyson wasn’t the only one whose reputation was caught up in the Batman v Superman spectacle. The movie also pulled journalists and politicians into the unwatchable proceedings, including Anderson Cooper, Soledad O’Brien, and Patrick Leahy. But Tyson, a bona fide scientist and science educator, should have known better. Batman v Superman is no Contact, the serious science fiction penned by his mentor, Carl Sagan. Contact explored alien contact and its effects on society, politics, and individuals in a plausible, grounded style, basing it all on elaborations of actual scientific principles.
As a representative of the scientific community, I think Tyson was wrong to lend his credibility to a fantasy movie. But this type of blending of science and entertainment is becoming commonplace, and represents a larger and longer-term trend in the way our society shares serious information, a trend that inhibits memory, uptake, and a sense of a shared reality. Tom Nichols writes about this in the book I reviewed yesterday, The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters:
So why do people remain resolutely ignorant and uninformed, and reject news, along with expert opinion and advice, even when it’s all delivered to them almost without effort? Because there’s too much of it and it is too closely fused with entertainment.
We live in an age dominated by television, video, movies, and screens of all sorts, with text becoming more decorative than functional in many of the environments where we find information. Is the medium the message, as Marshall McLuhan said?
Perhaps the medium is the metaphor. This is Neil Postman’s twist on McLuhan’s quote, a riff that appeared in 1985 in Postman’s book Amusing Ourselves to Death. The book struck me as prescient when it was published, and the passage of time has only cemented that perception. The other day, I dug out my original paperback edition (college bookstore price tag still glued to the back — $6.95), and refreshed my memory. Postman’s premise is that by moving away from textual sources to video and entertainment, we’ve moved the metaphor from print and its expectations of logical satisfaction to a metaphor of spectacle and its expectations of emotional satisfaction.
In the 30+ years since I first encountered Postman’s book, showmanship — polished or tawdry — has replaced serious discussion, changed how information is generated, lowered the demands of communication, and weakened critical thinking. Naturally, this has had detrimental effects on the communication of science. Bring a snowball onto the floor of the Senate to dispute global warming, and you just might set back the climate debate for weeks, months, or years. Make a movie about the perils of vaccination, and get invited to the inauguration of a reality-television President. Or, feel comfortable undermining your role as a serious scientist and science educator by lending your credibility to a fantasy spectacular, as Tyson did.
Seriousness has been replaced by fatuousness, even at the highest levels. What some call approachability has consequences, including giving laypeople false confidence about how much they know, how well they think, and how clearly they understand complex ideas and intellectual cultures.
The complexity of science does not survive in an environment in which simplicity and emotional gratification are routinely sought. In fact, simplicity may make it easier to reject facts people don’t like, as an article in the Guardian captured recently:
. . . could it be that non-scientists, emboldened by easy-to-digest science stories in the media, now have the confidence to reject what scientists say, or go with their gut feeling instead?
The paper that spurred the Guardian article is from the journal Public Understanding of Science. This research suggests that simplifying science can actually undermine its standing with the lay public, producing what the authors call “the easiness effect”: understandable lay summaries give laypeople more confidence in their own ability to judge scientific claims, increase their general trust in their own judgment, and lower their desire to seek or defer to outside expert opinions. Making science easy makes it more emotionally gratifying while fostering false confidence, if these results are to be believed.
Science, like reality, is difficult, frustrating, complex, and nuanced. Television and other modern media offer something easier and more entertaining. Simplified and mostly visual, Postman writes, television:
. . . offers viewers a variety of subject matter, requires minimal skills to comprehend it, and is largely aimed at emotional gratification. . . . Entertainment is the supra-ideology of all discourse on television. No matter what is depicted or from what point of view, the overarching presumption is that it is there for our amusement and pleasure.
Typical science education now suffers from the instinct, or demand, of the entertainment metaphor: offer emotional gratification in place of frustrating complexity and painstaking logic. The term “empirical entertainment” has been used recently to describe shows like MythBusters and others that attempt to bring science to the masses, with the personalities leading the escapades called “science entertainers.”
From podcasts to television programs, science educators/entertainers may be failing to recognize that the medium they now use imposes severe limitations on its practitioners and its audience, including:
- Ideas organized to maximize entertainment value
- Constructions that thwart uptake and recall
- Inherent simplifications to support the dominant goals of striking visual imagery
- Story arcs that mute logic in order to maintain emotional engagement and gratification
Compare a television show like the recent version of Cosmos — which pushed large scientific concepts into cartoons and dramatizations focused almost entirely on emotional gratification through storytelling — with reading complex scientific literature about electrical conduction, astrophysics, or biology. You learn from the latter, but are entertained by the former. Studies suggest the difference is night and day when it comes to comprehension and recall. Perhaps this helps to explain why our cultural progress over the past 40 years has not been toward more scientific literacy, but rather away from consensus and comprehension. We don’t learn as much, and we can’t remember it anyhow, so we go with what was actually instilled by simplified science and entertaining experts — an inflated sense of our own abilities and an inherent message that how we feel matters more than the facts.
News as entertainment, facts subsumed by spectacle — these trends would not have surprised one of the sources in Postman’s book, Aldous Huxley. As Postman writes:
[Huxley] believed that it is far more likely that the Western democracies will dance and dream themselves into oblivion than march into it. . . . it is not necessary to conceal anything from a public insensible to contradiction and narcoticized by technological diversion.
That last sentence bears re-reading, trust me.
It was not always so. A main theme of Postman’s book is that there was a period during which the “typographic mind” came to dominate the world, one in which:
. . . print put forward a definition of intelligence that gave priority to the objective, rational use of the mind and at the same time encouraged forms of public discourse with serious, logically ordered content. It is no accident that the Age of Reason was coexistent with the growth of a print culture, first in Europe and then in America.
But this period is being (has been?) undone by the rise of new communication media, which impose their own rules, their own metaphors:
. . . as typography moves to the periphery of our culture and television takes its place at the center, the seriousness, clarity, and, above all, value of public discourse dangerously declines.
If we continue to participate in the broader cultural movement toward spectacle, we may help to feed the suppression and culture-death predicted by Huxley, not the more commonly intoned Orwellian type. As Postman encapsulated in 1985:
In the Huxleyan prophecy, Big Brother does not watch us, by his choice. We watch him, by ours. There is no need for wardens or gates or Ministries of Truth. When a population becomes distracted by trivia, when cultural life is redefined as a perpetual round of entertainments, when serious public conversation becomes a form of baby-talk, when, in short, a people become an audience and their public business a vaudeville act, then a nation finds itself at risk: culture-death is a clear possibility.
Social media is an extension of the attention-busting entertainment space, imposing its own performance constraints on both producers and consumers. At no point is there a demand for a logical, coherent, informative summary of information, a perspective on events. Facebook and Twitter and Snapchat defy exposition, coherence, and logic. They are temporal, not logical.
With all this shallow and illogical information, what if audiences are starting to smell a rat in their dependence on television and digital? And what if academic and scholarly publishers are in the thrall of too many digital preconceptions to acknowledge their needs and demands?
This may be the case already in the newspaper business. In a recent study in Journalism Practice entitled “Reality Check: Multiplatform newspaper readership in the United States, 2007–2015,” Chyi and Tenenboim sought to discover how readership has actually responded to all the various digital initiatives newspapers have tried. Their analysis throws many digital assumptions into doubt. One major preconception is that digital newspapers would be more popular among the young. Chyi and Tenenboim’s research shows that among 18-24 year-olds, 19% had read a print newspaper in the prior week, while only 8% had read the online edition. Moreover, in no age group did the reach of the online edition exceed that of print, and by a good margin in every case. These and other findings were so shocking to a major newspaper trade association that it:
. . . told her that because her findings showed that moving to digital might not be the best strategy for newspapers, the organization didn’t want to share them with its members.
This type of self-imposed blindness to facts and actual trends may be part of why newspapers have continued to flounder. But, as Jack Shafer of Politico writes:
These findings matter because conventional newspapers, for all their shortcomings, remain the best source of information about the workings of our government, of industry, and of the major institutions that dominate our lives. They still publish a disproportionate amount of the accountability journalism available, a function that’s not being fully replaced by online newcomers or the nonprofit entities that have popped up. If we give up the print newspaper for dead, accepting its demise without a fight, we stand to lose one of the vital bulwarks that protect and sustain our culture.
Here we see a false equivalency in publishing theory, in this case between print and digital. Scholarly publishers harbor similar assumptions: that digital is superior to print, more highly desired, and more useful. Digital has so many benefits — searchability, reach, and format variety — that we can’t fathom that we may be wrong.
Yet, year after year, we face the enduring utility of the PDF, the enduring print circulations of many journals, and the enduring sales of print books. We proclaim death to the PDF, to its tyranny of columns, typography, and pagination. Yet, it continues to gain in popularity, infusing new services and competitors with value while we complain about how outdated it is. Are we as blind as the newspaper trade to the value of print and a useful print proxy? If we’d treated the PDF as a valuable asset and not an afterthought, would Sci-Hub and its ilk have taken every publisher’s PDFs so easily?
The PDF’s power may run deep, in ways that scientific and academic publishers need to contemplate. After all, in the “alternative facts” world we find ourselves in, conveying accurate, expert information easily and memorably may be more important than ever. Research suggests print conceits — “the typographic mind” — convey these benefits. PDFs are our best print proxies.
There have been private studies in the marketing realm exploring the value of print compared to digital, employing MRIs and fMRIs to detect differences in brain function based on media. In one study, conducted on behalf of Canada Post, the findings indicate:
- Paper media impose a lower cognitive load (they’re easier to read). Online layouts are inconsistent, chaotic, and intrusive, and are often interrupted by advertising or other distractions, so readers must figure out each one anew, making it harder to absorb information. When a message enters the mind easily and makes sense right away, you’re much more likely to encode it into memory; distractions and fragmented layouts impede uptake and recall.
- Physical media are processed faster than digital media. Physical media are tangible, engaging the mind more immersively, allowing information to flow faster into the brain. This matters because, as one summary of the research puts it, we live “in an era where goldfish have longer attention spans than humans.” Physical media make the most of it.
- Physical material is more “real” to the brain. Physical information has more meaning, a place and time, as another private study summary states. It is better connected to memory because it engages with people’s spatial and temporal memory networks. Your fingers remember the size of the book, the turning of pages, the feel of the cover, the heft of the object, and where you were when you read it. A screen leaves no tactile and little temporal memory, so you don’t remember it as well. This was underscored when I remembered a 32-year-old book from college, and knew where it was on my bookshelves. The pages looked and felt immediately familiar again, bringing back a splash of memories featuring this book and interacting with it at various stages in my life.
- Physical media are more interactive during intellectual work. Users write on pages, fold them, tear them out, post them in public spaces, leave them on desks or tables to peruse again, put them in their briefcases, stumble upon them years later. People interact with paper media more, in short, which again aids recall through repetition and engagement on many more levels than digital.
Moving wholly in the direction of Facebook, Twitter, video, and digital without the infrastructure — intellectual and actual — of print or other physical media may be a profound mistake for science. With the volume of research increasing, a medium that is easier to read, is processed faster, and aids memory retention has a lot to recommend it. Yet, rather than bolstering our print editions, we’ve hacked at them so we can afford more social media outlets to drive traffic, fancier screen designs that distract and dazzle, and more navigational options to entice users to . . . well, to our PDFs, mostly, because that’s what they seek in spite of all this.
Maybe scientists and scholars still possess more of the typographic mindset than we believe? Maybe we have a false equivalency between digital dazzle and print practicality?
More than a decade ago, when I helped to launch a video article series at the New England Journal of Medicine, the editors insisted we add a PDF summary of the findings to each video, feeling these summaries were important to highlight the logical course of action and to underpin recall. The project team thought this was just more work and, ugh, PDFs, but the popularity of the downloads proved the editors right. To this day, PDF summaries are still created for every video, and they are still popular. I’ve seen them printed out and left in residency program inboxes for trainees on more than one occasion.
The spectacle of new media is hard to resist, and publishers seem susceptible to the same siren songs that have drawn newspapers away from potentially more effective and profitable directions. More importantly for scholars, educators, and researchers, we may be moving away from what works for deep reading, logical constructions, and demanding intellectual interactions. We are moving away from the tried and true conceits and practices from the era that generated the typographic mind — writing, exposition, and explanation. We may be falling prey to a misleading notion that digital provides an equivalent — some might say, superior — experience via spectacle and moving pictures. Those assertions may be exactly wrong.
As perhaps the guardians of many vital aspects of intellectual life, such as facts, truth, and accurate information, our awareness of the pitfalls of spectacle has to improve. Our allegiances to the traits of the “typographic mind” could be strengthened with a willingness to accept the constraints and benefits of print proxies and what our audience truly prefers. It may be time to view the PDF and all the other pedestrian and popular imprints of the typographic mind with a smidge more respect.