Over the past few years, we’ve been witness to a parade of partisans in the debate over whether the Internet is making us smarter and more capable or turning us into shallow and superficial information parasites.
Nicholas Carr carries the most water for this argument, but others have joined in. Usually, their arguments that we’re going too far, becoming too fragmented, or becoming distracted are positioned to seem as if they have our best interests at heart — concern for our minds, our families, our communities, our culture.
. . . the Never-Betters, the Better-Nevers, and the Ever-Wasers. The Never-Betters believe that we’re on the brink of a new utopia, where information will be free and democratic. . . . The Better-Nevers think that we would have been better off if the whole thing had never happened, that . . . books and magazines create private space for minds in ways that twenty-second bursts of information don’t. The Ever-Wasers insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others.
A recent post by Jeff Jarvis puts what he calls “the distraction trope” into perspective. Instead of worrying about whether our brains, families, or communities are changing, Jarvis strips away that sophistry and lays bare something more primal that seems to be at stake:
And isn’t really their fear . . . that they are being replaced? Control in culture is shifting. We triumphalists — I don’t think I am one but, what the hell, I’ll don the uniform — argue that these tools unlock some potential in us, help us do what we want to do and better. The catastrophists are saying that we can be easily led astray to do stupid things and become stupid. One is an argument of enablement. One is an argument of enslavement. Which reveals more respect for humanity? That is the real dividing line. I start with faith in my fellow man and woman. The catastrophists start with little or none.
Throughout history, this fear of losing control has consistently been masked as concern for higher, even altruistic interests. Jarvis quotes Erasmus (via Elizabeth Eisenstein’s new book, “Divine Art, Infernal Machine”), who said during the proliferation of books:
To what corner of the world do they not fly, these swarms of new books? . . . the very multitude of them is hurting scholarship, because it creates a glut, and even in good things satiety is most harmful. [The minds of men,] flighty and curious of anything new [are lured] away from the study of old authors.
Erasmus was worried about losing control over a world he’d mastered through his knowledge of old authors and stable cultural touchstones, and Carr is worried about losing control over a way of studying and thinking and processing information he’s become adept with. These are not the political leaders of the Middle East who are concerned about destabilization at an entirely different level (but for some of the same basic reasons, and from some of the same fundamental causes). Control has a softer side than anything we’d associate with authoritarianism.
Control can flow from competence and tradition. Change threatens both.
Competence is a reassuring zone, but when the boundaries change and your competence is no longer relevant, there are two choices — change, or fight to keep the status quo.
Tradition allows status symbols and status signals to exist. Mention that you went to a Bach concert in a high-tradition culture, and it’s meaningful. Mention it in a high-change culture, and you’re potentially tagged as a stuffed shirt.
The battle for the information landscape has also been one about losing or gaining control, about change versus the status quo.
It’s vital to note here that the Internet is different from any other communications medium precisely because it’s the first that nobody controls, thanks to packet switching, TCP/IP, and other architectural marvels at its center. Its tradition is change. Competence with it demands change.
The people who first envisioned a network of interlinked computers — an internetwork — also imagined that this marriage of computing capabilities and human capabilities would change human thinking abilities and habits because the approach decentralized knowledge and created a new path forward. As J.C.R. Licklider wrote in 1960:
It seems reasonable to envision . . . a “thinking center” that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval. . . . The picture readily enlarges itself into a network of such centers, connected to one another by wide-band communication lines and to individual users by leased-wire services. In such a system, the speed of the computers would be balanced, and the cost of the gigantic memories and the sophisticated programs would be divided by the number of users.
But Licklider knew this would also change how we think:
The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought.
So the visionaries of a new technology hoped it would not only change how we think but improve it. Licklider followed this up in 1968 with a sentence that anticipates why you’re reading this:
In a few years, men will be able to communicate more effectively through a machine than face to face.
The stories of Licklider, Douglas Engelbart, Paul Baran, and others are recounted wonderfully in the book, “The Master Switch: The Rise and Fall of Information Empires,” by Tim Wu. It’s a book I can’t recommend highly enough (and I want to thank Bob Kelly for suggesting it to me). It outlines how, in film, phone, and television, the power of “the Cycle” of creative destruction and centralization/decentralization is helped or hindered by government action or inaction, by immovable personalities, by corporate power, and by the pursuit of money.
The clear message from Licklider and others in the 1950s and 1960s was that the centralized knowledge system — created by the powers behind AT&T’s telephone monopoly, the studio system in Hollywood, the major radio networks (NBC and CBS), and radio extending its influence over early television (with broadcast television later trying to stifle cable television) — fostered an illusion of unity in the United States while stifling innovation. The authoritarian mode of thinking, whether driven by a U.S. government that was, in retrospect, surprisingly oppressive; by corporate monopolies that sought to perpetuate themselves; or by social scolds who enjoyed the perch atop a centralized cultural replication machine, needed to change.
The anachronists who wish to return to things like “deep contemplation,” “classic texts,” and the like seem to hearken back to a centralized information economy, one which had some virtues, but virtues not worth the costs imposed on innovation, personal liberty, self-definition, diversity, and choice.
It’s not that one is all good and one is all bad. There is a trade-off, an elusive balance, a mix of benefits and costs. In writing prescient about both the promise and the perils of humanity’s continuing exploration of its boundaries, Sigmund Freud once wrote:
Man has, as it were, become a kind of prosthetic god. When he puts on all his auxiliary organs he is truly magnificent; but those organs have not grown on to him and they still give him much trouble at times.
We may argue again and again whether the Internet is changing our brains, elevating us, lowering us, making us smarter, or making us stupid. But at the end of the day, it seems the real argument is about control — who has it, who shares it, and who wants it.
So, despite all the partisans, sophistry, and essays about our brains, our culture, our souls, it’s important to remember that what we’re really arguing about is control.
Sometimes, a cigar is just a cigar.