Many of us find ourselves worrying about trends that feel both potent and recent: a growing inability to distinguish between fact, opinion and lies, and a declining capacity for thoughtful, collective discussion. Someone who has reflected deeply on these issues is Siva Vaidhyanathan, Robertson Professor of Media Studies at the University of Virginia. Trained in both history and American Studies, he brings the long view to the challenges of media, culture and democracy in the first decades of the 21st century. He is the author of five books; his first book on big tech – The Googlization of Everything — and Why We Should Worry – was published by University of California Press in 2011. His latest book – Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, just published by Oxford University Press (go, UPs!) – couldn’t have come at a more important time. We spoke recently about the intricate relationship between media and democracy, and the critical role that cultural institutions – including scholarship, publishers and libraries – need to play in countering this pernicious hold on our attention.
Anti-intellectualism and mistrust of authority have deep roots in American culture dating back to our founding. How does our current moment fit this trend?
Depending on how we want to view it, anti-establishment thinking dates all the way back to the Reformation, and much of it has been healthy, enlightening and interesting. The great intellectual question we’ve had for the last 500 years is: how do we moderate this trend? We want to celebrate someone like Galileo, but not to promote anti-vaccination thinkers. For the most part, we’ve done a pretty good job of filtering sense from nonsense.
Long before the rise of Facebook or the web, most advanced societies had filters that could place such expressions in a critical framework. That’s harder to do now because we have so many different platforms, and those voices are amplified through algorithms. But before we get to that, your larger question about the historical trend is the more serious one. What’s really striking to me is how this shift is not limited to one part of the world – if it were really a cultural, economic, or political phenomenon in a particular country, it wouldn’t travel. The fact that we’re seeing it worldwide leads me to believe that it’s being driven by mediated amplification.
Recent months have focused on Facebook’s role in the 2016 US election and the misuse of data through Cambridge Analytica. But your analysis takes a number of steps back to assess Facebook in its social, cultural and historical contexts, and to examine parts of the story that have received less attention – such as the way in which the Trump campaign used Facebook to suppress the vote, especially among minorities.
It’s a long-time political tactic to try to reduce enthusiasm for your opponent, but what’s new is the ability to do it surgically – to press a particular issue button with a very narrow range of voters and be able to test the effectiveness. Trump actually didn’t need Cambridge Analytica to profile American voters – Facebook did it for him. This ability to constantly test and improve your ad gives it power beyond anything that would ever appear on television or radio, neither of which is very precise (and on which the Clinton campaign relied).
No doubt you’ve been closely watching Facebook’s reaction – and Mark Zuckerberg’s — over recent months. Is there anything that gives you hope in what they’ve said or promised in terms of action?
Given how big Facebook is and how fast everything moves on Facebook, the responses are wholly inadequate. That said, I’m not sure that an adequate response exists as long as Facebook remains Facebook. As long as you have an algorithm that favors content that generates strong emotions there’s going to be pollution – the sort of content that distracts and misleads us, or merely gets us angry at each other.
The process of moderation in a system that big is basically impossible and it’s fascinating to me that Facebook remains optimistic enough to think that it can be done. In many cases, their attempts are country specific – for example, they made efforts to police the newsfeed in Germany during the run-up to the election. But these countries matter less every day as the user base grows dramatically in other parts of the world. Even in a democratic country like India with deep roots in scholarship and journalism, we are increasingly seeing attention flow to Facebook and WhatsApp, and away from independent media. The overall ecological effect is far more significant in these countries than in the US and Europe.
It seems to me that Facebook appears to have been very naïve at best in understanding its negative potential — social psychology and other social science research could easily have predicted this 50 years ago.
Absolutely. Had Facebook been more open to self-criticism, had their leaders not been so convinced of their self-righteousness, there might have been an ability to anticipate these problems, or address them, a decade ago. But ultimately, that would have meant not building Facebook.
There is nothing any individual can do to limit the surveillance power of Facebook…if the response is not collective and political and firm, then the response is inadequate because these problems are so daunting.
You also talk about the need to approach solutions not from an individual perspective but from an environmental one – what do you mean by that?
I’m always troubled by responses to privacy and surveillance that are focused on what individuals can do – putting the burden on individuals is both unfair and ineffective. And as we now know, it actually doesn’t matter whether you’re on Facebook if one of your friends has your contact information in their phone and uses Facebook. There is nothing any individual can do to limit the surveillance power of Facebook – if we care about civil and human rights, we have to face this systematically and politically. When we individualize ecological problems, we fool ourselves into thinking, for example, that our recycling is going to limit the huge ball of trash in the Pacific Ocean – it’s not. If the response is not collective and political and firm, then the response is inadequate because these problems are so daunting.
I’ve just come back from the annual Society for Scholarly Publishing meeting which was animated by the potential of new technologies – especially AI – to transform our industry. Yet if one thing is clear from your book, it’s that we were too eager to welcome technology and its conveniences without consideration of the long-term costs. Is there a cautionary tale for us as we look to our own future?
I would hope that among all the industries in the world, publishing is better prepared to ask hard, critical questions. Analytics and data analysis are so cheap and easy that everyone is engaging in them. The deeper question is how they change relationships with researchers, libraries, and scholarly societies – questions about power and money that should be asked early rather than late. And given that scholarly publishing already suffers from a concentration of power from both a market and political perspective, will such changes serve to deepen this, and does that serve the researcher and the research enterprise?
As with Facebook, it’s dangerous to think that we can solve all problems through the magic of AI. It’s clear that machines will learn differently and that different values will be built into the learning algorithms – these are the critical questions that matter. A system that helps manage PLOS ONE submissions at a first level of vetting could be valuable, but who gets to write that system? What changes are allowed along the way? If the machine really is “learning”, what are the goals to which it is adjusting itself? The promise of AI has been “hey, we get to think less”, but the reality is that the burden on humans may be higher as the consequences of AI working across these different realms will be so significant.
If Facebook’s problems are not peripheral, but core to what makes Facebook Facebook, then how should we think about it as a marketing channel or for building communities? Is there an ethical and responsible way for an organization to do this, given what we now know?
I’m not sure anyone has a choice, but it’s getting harder to do effectively. As long as Facebook remains one of the major ways that people learn about things in the world, anyone who is trying to sell anything has to be active on Facebook; otherwise, you’re leaving out a substantial potential audience. We have to play the game, but just recently, we’ve seen an interesting change. As Facebook has tried to address one of the political problems – no transparency for people placing a political advertisement – they are now demanding that disclosure, which means that publishers can’t buy ads for books and articles that include words such as “Democrats”, “Republicans”, and “abortion” without declaring themselves to be political action committees. It’s bizarre and counterintuitive, but an example of how clumsy regulating Facebook has to be, because it is dependent on algorithms to filter out certain forms of speech.
It’s clear that you don’t believe that actions like the #deletefacebook campaign are the answer and that we have to be thinking on a different level. What is it that we should be doing?
Our powers of intervention are limited in various ways – there’s nothing that can have a universal impact. If citizens of the United States demanded that our government address antitrust and competition problems, there’s the potential to at least break off parts of Facebook. I would like to see Instagram, WhatsApp, and their virtual reality project, Oculus Rift, separated from Facebook. VR has such potential to impact human behavior that we shouldn’t allow Facebook to play around with it and integrate it with its enormous pools of data – the potential for manipulation here is way too dangerous.
These interventions are necessary, but they’re also insufficient as long as Facebook remains Facebook. I would like there to be a rich and deep conversation in which we do not pretend that we can solve this problem with one, two, or three pieces of legislation, or one, two, or three new projects from Facebook. We have to get more creative, be bolder, and get lucky. But I’m also not optimistic that we can do this – we have to break out of our habit of thinking that we can just create another layer of technology to mitigate the problem created by the previous one.
There’s a frenetic process of distraction through our devices that draws us away from newspapers, magazines and books, out of conversations with smart people from whom we could learn.
You note that Facebook undermines efforts to deliberate deeply about important matters and talk about the importance of scholarship over the long-term. You finish with “a plea for a reinvestment in institutions that promote deep thought conducted at analog speed”. How would you like to see us step up to these challenges?
That’s a good idea regardless of whether or not Facebook was ever invented. But now the problem is more urgent – we need other ways for humans to learn about the world and interact with each other, healthier ways, less commercial and addictive ways. We should explicitly build out public forums, we should subsidize journalism, we should enhance our public support for universities, museums and libraries. We should promote and take more seriously the scientific project, which we now completely take for granted. We have to understand that historically the scientific enterprise is key to all the US has built. The leaders of the People’s Republic of China get that, and yet we seem to have forgotten. So, building up those institutions is crucial no matter what. But they’re even more crucial now that we have this direct threat to our ability to think, and that’s ultimately what we’re talking about here. There’s a frenetic process of distraction through our devices that draws us away from newspapers, magazines and books, out of conversations with smart people from whom we could learn. The lure and gamification of Facebook is a big part of that – it prompts us to reward Facebook for rewarding us, and that’s why it’s so addictive.
I was as surprised as anyone by the election result and turned to books to help me understand what I’d missed. But I’m not typical — any hopeful advice to end on?
Buying books is great! Buying books to put in your library, and raising your kids to do the same – those are really the only individual actions available to us. But I’m impatient with the idea that the burden should be on us. We didn’t screw this up. It’s always good to try to understand the dynamics of the world. I bought twenty or so books after 9/11 to understand – and I don’t regret that; those books are still on the shelves behind me. And yet I didn’t think that it was my responsibility to defeat Al Qaeda. We have to act as global citizens, to care about how Facebook has crowded out other media that might enhance deeper thought. We have the ability to command our legislators – it won’t be easy or quick, but with steady and concerted pressure, we have a shot.