
Recently, while reading a movie review, I came across something that was really frightening:

Here’s a thought that will fester for all of you folks out there slowly aging as you read this: recently a friend noted that if Universal greenlit and released a remake of BACK TO THE FUTURE today and Marty McFly traveled back the same amount of time as in the original, he would land smack dab in the middle of 1980. Let that sink in for a moment. That’s how BACK TO THE FUTURE looked to our parents.

Yikes. Digging a little deeper, it would seem that the author’s “friend” probably found this idea in Chuck Klosterman’s recent book, “Eating the Dinosaur.”

A contributing editor for Esquire magazine, Klosterman is best known for his columns and books analyzing popular culture, with a particular focus on music. I’ve always enjoyed Klosterman’s writing, but I’ve almost always completely disagreed with his tastes. His inherent contrariness is what makes him entertaining because he spends “an inordinate amount of time searching for the underrated value in ostensibly stupid things.”

“Eating the Dinosaur” is a step up for Klosterman. While still as contrary as ever, he seems to have matured to a point at which he doesn’t need to force his tastes on the reader. This frees him up to dig deeper into questioning why we do the things we do.

While the Back To The Future factoid provides a compelling lede for a movie review — and it certainly made me feel terribly, terribly old — the review’s author misses out on the deeper point Klosterman makes:

Before [Michael J.] Fox plays “Johnny B. Goode” at the high school dance, he tells his audience, “This is an oldie . . . well, this is an oldie where I come from.” Chuck Berry recorded “Johnny B. Goode” in 1958. Back to the Future was made in 1985, so the gap is twenty-seven years. I’m writing this essay in 2009, which means the gap between 1985 and today is twenty-four years. That’s almost the same amount of time. Yet nobody would refer to Back to the Future as an “oldie,” even if he or she was born in the 1990s. What seems to be happening is a dramatic increase in cultural memory: As culture accelerates, the distance between historical events feels smaller. The gap between 2010 and 2000 will seem far smaller than the gap between 1980 and 1970, which already seemed far smaller than the gap between 1950 and 1940.

Though Klosterman leaves it at that, the Internet is an obvious driving force behind this expansion of cultural memory, and it’s interesting how often mention of “new media” creeps into Klosterman’s trademark in-depth analyses of subjects like ABBA or the wildcat offense in football. Though our society is changing more and more rapidly, we no longer leave anything behind. If you want to explain the term “jumping the shark” to someone, you pull up a YouTube clip of Fonzie from Happy Days. The sorts of things that would live on only in our memories (and get mythologized and embellished due to nostalgia and our faulty recall) are all readily available for scrutiny. There’s something about this that bonds us all together, that allows for a common frame of reference and understanding.

Common bonds aside, is this a good thing? As Kent noted in his recent review of Jaron Lanier’s new book on open networked culture:

Lanier talks about the extended neoteny our rich culture is allowing, and how this extended childhood is exerting an oddly conservative force on our culture — teens today don’t ever lose touch with friends, so don’t reinvent themselves as dramatically as people who could break cleanly from their pasts multiple times over their lifetimes.

The same could be said about the cultural relics of our childhood. Instead of leaving Rocky and Bullwinkle behind and expanding our horizons, we can stay locked in nostalgia on the flying squirrel and moose’s very own Hulu page.

It’s fascinating to see how Lanier and Klosterman, two very different individuals with very different backgrounds, address very different topics but end up at the same point. Even more interesting is where they coincide with the subject of Klosterman’s final essay in “Eating the Dinosaur” — Ted Kaczynski, the Unabomber.

While Lanier decries the “hive mind” which makes it impossible for our young to develop as “fierce individuals,” Klosterman, in typical contrarian fashion, calls out the Unabomber as a damaged and deranged individual, but notes that much of his manifesto reflects the toll that increased media exposure has taken on our freedom of thought. He presents the premise that Homo sapiens have existed for at least 130,000 years, and for more than 129,900 of those years, any moving image a human saw was . . .

. . . actually real. . . . we were conditioned to understand that seeing something in motion had a specific meaning. But that understanding no longer exists; today we constantly “see things” that aren’t actually there. . . . Is there any possible way that 129,900 years of psychological evolution can be altered within the span of a single century?

This biological inability to cope with the pace of technology, Klosterman (and Kaczynski by proxy) argues, robs us of our “freedom to think whatever we want.” Citing Jerry Mander’s 1978 tome, “Four Arguments for the Elimination of Television,” Klosterman asks the reader to picture a variety of scenarios:

  • life in an Eskimo village
  • a pre-operation conversation among doctors
  • the flight of Amelia Earhart
  • the Old West
  • a basketball game

After you’ve done this, Mander asks the following:

It is extremely likely that you have experienced no more than one or two of [these situations] personally. Obviously, these images [inside your head] were either out of your own imagination or else they were from the media. Can you identify which was which?

The argument is that this pre-programming of our imaginations with stock images and ideas from movies, television, and the Internet takes away the freedom of creating our own visions, of using our own imaginations. As our technology becomes more and more futuristic, the paradox is that it’s burying us in our cultural past.

I don’t buy Klosterman’s (and Kaczynski’s) conclusion — that technology is ultimately bad for civilization — and I don’t think Lanier would agree either. But it does raise interesting questions about the difficulty of originality and creativity in a time of so much media inundation, and how this is affecting our culture.

While the Internet may indeed be making us smarter, is it also making us less interesting and original?

Technology has greatly democratized access to media and the tools of creation, but has this resulted in a revolutionary creative era? Have the past 10 years yielded any new major movements in art, fiction, film, dance, music, etc.? It’s certainly debatable. More people can create and be seen and heard, but has this meant more innovation?

The biggest trends I can see in music in recent years are a garage-rock revival and a re-hashing of “progressive rock,” a style so dreadful that punk rock was invented as an antidote to its overblown sense of self-worth. The wildly original and innovative use of sampling has now degraded into a shortcut around having to write your own melody. The “mash-up” fad speaks for itself: clever, to be sure, but brilliantly original? Not so much. Perhaps there is value in having to re-invent the wheel every now and again, rather than relying on already existing building blocks.

Once one accepts the loss of freedom of thought, Klosterman gives the reader two ways to react. Kaczynski abandoned society and went off to the wilderness, losing his mind to the point where he saw killing and maiming as legitimate publicity tools to get his message out. Klosterman instead surrenders:

The Internet is not improving our lives. It’s making things (slightly) worse. But because I’m not free — because I’m a slave to my own weakness — I can no longer imagine life without it. I love the internet . . . but I cannot be saved.

I’m not as pessimistic as Klosterman. I do think originality and creative thought can thrive, but as a species, we’re generally lazy and prefer to take the path of least resistance. This age of abundance gives us more of those easy stepping stones rather than forcing us to blaze our own trails. I’d like to think of this as an era of transition, a period where we are still trying to comprehend and move beyond what technology hath wrought.

The challenge, as the pace of technological change increases, is to adapt before the next paradigm buries us in a new set of issues.

David Crotty

David Crotty is a Senior Consultant at Clarke & Esposito, a boutique management consulting firm focused on strategic issues related to professional and academic publishing and information services. Previously, David was the Editorial Director, Journals Policy for Oxford University Press. He oversaw journal policy across OUP’s journals program, drove technological innovation, and served as an information officer. David acquired and managed a suite of research society-owned journals with OUP, and before that was the Executive Editor for Cold Spring Harbor Laboratory Press, where he created and edited new science books and journals, along with serving as a journal Editor-in-Chief. He has served on the Board of Directors for the STM Association, the Society for Scholarly Publishing and CHOR, Inc., as well as The AAP-PSP Executive Council. David received his PhD in Genetics from Columbia University and did developmental neuroscience research at Caltech before moving from the bench to publishing.

Discussion

9 Thoughts on "The Internet’s Extended Cultural Memory — Is It Sapping Our Creativity?"

This is just me pontificating, but perhaps creativity appears to be sapped due to the background noise that cultural libraries such as the internet provide?

Where previously people were constantly saying the same things to a smaller audience (perhaps even in tandem), now it is possible for one to speak to many with a few mouse clicks. The logical conclusion of this argument, I suppose, is that we were always as creatively limited; we just didn’t know it.

But how much of this is a mirage resulting from those of us of the same era talking to each other? I recently showed my kids ‘Wacky Races’ on YouTube – great for me (and my mates). They, my kids, couldn’t care less. They know what’s new and relevant.

It isn’t us.

It’s less a question of relevance than it is a question of how this exposure affects our brains and the creative process. Obviously each generation will have its own formative influences, and that’s something that, despite the common frame of reference media availability provides, does not translate.

I’ve had a similar experience to what you describe when showing kids the great Peanuts Thanksgiving and Xmas specials of my youth. These are inevitably met with a yawn and a “so what.” So while you can at least provide access to the actual content, you can’t provide context.

The constant availability has taken away some of the special nature of that content (no matter how mediocre it actually was). Saturday mornings were something special; that’s when the cartoons were on. Now, with 24-hour access, cartoons are perhaps devalued. The Wizard of Oz was only on once a year; it seemed like a bigger deal than it does now that you can pull out the DVD any time you’d like. In some ways this lets us make more objective judgments about whether Dastardly Dan’s cartoon was actually pretty crappy, but it seems like something has been lost as well.

The classic text on the challenges of grappling with cultural memory as a creative artist is perhaps The Anxiety of Influence (and its companion, A Map of Misreading) by Yale professor and literary critic Harold Bloom.

Bloom essentially argues that every artist’s work is a result of grappling with the work of their predecessors – and he develops a concept of “misreading” (which really isn’t at all misreading but rather a critical and artistic re-interpretation).

Bloom is primarily concerned with poetry in this meditation, but his point applies to all creative endeavors (I use the term broadly to encompass all creation of intellectual or cultural artifacts). The interesting thing here, I think, is that poetry has a long and relatively complete recorded history (for major poets going back more than 500 years). A poet has to grapple with this history (which, if one is Chinese, is indeed even more of a challenge than if one is writing in English). The challenges faced by poets (and writers of prose, painters, scholars, etc.) are now faced by others in fields that previously had more ephemeral natures. Music is a great example. Ditto for film. But it is now a challenge for “critics” (meaning anyone writing about, commenting on, or discussing popular culture) as well. I don’t know that there is anything new going on, but the Internet is creating challenges, for a great many fields and people, that were previously limited to those with well-recorded and archived corpuses.

Perhaps it’s just a magnification of the same issue. Even for the poet, having constant ready access to the entire canon will certainly have a different effect than knowing obscure poems exist in rare volumes but not having any way to read them.

But you’re right that it’s more pronounced in some fields, like music, where we’ve only had the ability to record for 120-odd years.

“Have the past 10 years yielded an new major movements in art, fiction, film, dance, music, etc.? It’s certainly debatable.”

This is highly subjective territory. As was mentioned in another comment, people have been saying that there is nothing new (e.g. in music) for, like, forever.

“The biggest trends I can see in music in recent years are a garage-rock revival and a re-hashing of ‘progressive rock,’ a style so dreadful that punk rock was invented as an antidote to its overblown sense of self-worth.”

That only means that (1) when you say “music” you probably mean only Anglo-American “popular music” genres and (2) you like neither prog-rock nor punk. Which is, by the way, fair enough, but what is there to debate? This is all a matter of taste. (As a side comment: if punk was “invented” as an antidote to prog-rock, it took at least 10 years to invent…)

“The constant availability has taken away some of the special nature of that content (no matter how mediocre it actually was).”

I completely agree with that. And maybe this is not a bad thing. Now at least one has a choice and can watch or listen to things one really wants, and see that there was nothing really special in that content; rather it used to be sold (in a very broad sense of this word) as special.

Yes, it is highly subjective; as I noted, I tried to leave it as an open question. But clearly there haven’t been any obvious large movements in popular music since grunge back in the 1990s. New trends are happening, but nothing has reached the same level of popular consciousness (or sales).

I would include urban African-American music in what I’m describing, though at this point that’s fairly indistinguishable from the Anglo-American popular music genre you describe (at least in terms of sales demographics). But even if you limit the analysis strictly to that area, the lack of original trends is evident. I’m not trying to debate whether anything new is any good or of value, just noting that we’re not seeing the same sorts of large movements we’ve seen in recent decades. Though it may speak more to a splintering of interests (narrowcasting) than to a lack of innovation.

And who says I don’t like punk rock? As a high schooler in the late 1970’s/early 1980’s, the movement was essentially dead before I came to it, but it still had enough relevance to offer insight into a life outside of the cultural backwater of the suburban South. I do have little patience for the modern revivals of these old styles, as they strike me as empty aping of the form without the same meaningful motivations, the sorts of nostalgia for the past that punk rock was meant to tear down. Did punk take 10 years to invent? Some date its invention to the first Velvet Underground album in 1967, others to the first Stooges album in 1969. I’d say those were more influences than actual occurrences. I’d peg it to the first Ramones appearance at CBGB’s in 1974, with the creation of a scene that was essentially over within a few short years. It was a very brief movement that, by its very nature, ended as soon as it was discovered by the mainstream.

And as I said, there are good and bad things that come with this ready availability of media. The wider range of choice, and the ability to reach like-minded individuals, is definitely a plus. But we have lost the sorts of defining cultural moments, things that allowed connection to a wider range of people.

i think so. if for no other reason than that the internet makes us so aware how rare true originality is, and how much like so many other people we are. i would predict that it would create a sense of futility. those artists who live and work ‘off the grid’ might seriously be best served in their careers by remaining so.

not fully committed to this idea; haven’t been thinking about it very long.

I’m actually thinking it’s something along these lines, but more about the dissolution of mass media into personal media and our ability to share media so easily. Mass media created a world of dominant genres against which creativity would shine as a dissent, and shifts in mass media genre preferences would just drive creativity differently — but it would always show up as a contrast to mass media. In a world like this, the perception is that creativity is abundant. Now, mass media is just another media option, and we all get slices from everywhere. So, everything is just a click away, and originality/creativity isn’t accomplished by being a minority presence (everything’s a minority presence) but by being truly original, and, as you say, that’s rare.
