In case you still harbor any doubts: yes, the year Two Thousand and Sixteen is going to be a pivotal one when the history of the 21st century comes to be written. This year has seen many things coalesce; my fellow chefs talk about some of them [HERE] and [HERE], and fine words they are too. If you haven’t yet, go read them. I want to talk about something that has been bothering me greatly for a considerable amount of time: the place of facts and reality, and of those who seek to bring us a closer understanding of those things, in the modern world. And no, I can’t believe I’m typing that sentence in 2016.
In the summer, after one of the most abject periods of political campaigning I can recall, the UK voting population was asked to advise the government on whether it thought we should stay in, or leave, the European Union. The vote was marginal (here’s some proper data analysis for you) and yet also compelling: the UK wishes the current government to negotiate our exit from the European Union. The thing that bothered me most was that one side chose to do a specific thing that the other side didn’t: the Leave campaign chose to attack experts. It chose to attack, belittle and dismiss those whose business it is to predict the future from the present state of things. It chose also to indulge in a more generic framing of those experts and academics as “Elitist” and “Out of Touch” and worse. Oh yeah, it also lied.
Over on the other side of the pond (full disclosure – most of my family now consists of US citizens), the events of the last week have played out in a manner consistent with 2016 in general. One side brought facts to a culture war. That side lost. There are a few things I feel are worth pointing out about what has gone on with these two events, because those things directly affect the world we live in. Our world is the world of the reality broker. We publish research and other scholarly thinking whose entire and only goal is to better fit our understanding of reality to the way the universe actually is. Historians and mathematicians; art and literature and the big three sciences; philosophy and sociology and economics — the list goes on, all trying to better reveal the majesty of reality.
So what has happened this year?
- Expertise, knowledge, and competence are now under direct attack
- The massive-scale social networks have a very serious (possibly fatal) flaw in how they enable their users to filter and process information
- The maximum surveillance nature of the ‘free to use’ web compounds the filter problem
- The triumph of the ad-supported, cumulative-advantage-winner business model has gutted the 4th estate
And all of these things have been coming for a while. The cascade failure of 2016 has roots going back well over a decade.
In November 2009 the Copenhagen Summit on climate change was a few weeks away when a server at the Climate Research Unit (based at the University of East Anglia) was (carefully and precisely) hacked. The resulting dataset and email trove was disseminated across the internet and seized upon by the usual suspects and, well, to cut a long story short, the Copenhagen Summit ended without any meaningful action being taken. The scientists were investigated really quite vigorously, and although they had done nothing wrong, eight separate committees investigated the science… The hacking received a response of, “Whoa, that’s a bit tricky eh! Well, it wasn’t anybody in Norfolk [UK], so dunno what to do really”. The scientists were criticized for not releasing all their data quicker and more transparently. I’ll come back to that.
This year, we have UK MP and former secretary of state for Education and then Justice, Michael Gove, with this quote: “People in this country have had enough of experts”, uttered during a TV debate on the referendum. Since then, UK Independence Party MP Douglas Carswell has picked a Twitter fight with a physicist over how the tides work. You can see the whole thread here.
Conservative MP Glyn Davies has tweeted: “Personally, never thought of academics as ‘experts’. No experience of the real world.” We’ve come a long way from the image of the 44th Vice President of the United States, Dan Quayle, being lampooned for his spelling abilities. None of the examples above has carried any political downside for its owner. These are people who ultimately have a say in the available funding for experts, and indeed in how experts are consulted by government…
Dr Sara Hagemann has experienced a taste of what this new world might look like. The UK government (the civil service) has been very clear with her (and other colleagues) that since she isn’t a UK citizen, they no longer wish her to be involved in any advisory role.
And then there’s the UK Department for Education (DfE) informing researchers using the UK National Pupil Database that they must share their results with department officials 48 hours before publication. Here’s the money quote: “This will reduce the risk that DfE are caught off guard by being asked to provide statements about research the appropriate people have not seen… [… it is] not the DfE’s role to check or approve the outcomes [but that] the right people have had time to digest it.”
I’m worried. I’ve not cherry-picked. This is stuff coming in on my infostreams, direct from the people concerned. The people we publish.
My fellow chefs Angela Cochran and David Crotty have written about this, so I’ll not dwell, except to point out that the rise and profusion of increasingly smart Artificial Intelligence (AI) powered info bots (very common on Twitter at least) is very much a double-edged sword. I do believe that AI agents are going to lead to a future that we in scholarly publishing should be exploring vigorously, but right now the negative effects of a bot that’s designed to swamp or pollute online discourse are what concern me most. The systems that Facebook and Twitter have, at least, seem self-evidently not up to the job of correctly curating and filtering this stuff. The problem seems particularly acute because of:
Items 3 and 4
Facebook, Google and Twitter make their money by selling advertising off the back of a massive personalized database of intentions (I urge you to read this again… 2003 warning us of a future we now live in), predicated on maximizing the delivery of ‘information’ that you will click on. Click-bait hacks the risk/reward and pleasure centers of the brain. That’s why it’s so effective. That’s why a bot that has enough AI to ‘say’ the right things, or simply drop any one of a number of canned comments and links, is so worrying. The power-law economics of the ad game mean that the bots win every time. This is an informational denial-of-service attack. The 4th estate has come to grief because of this. They need the clicks to survive. Nothing. Else. Matters. If you have a Google Home or Amazon Echo device, you might want to think real hard about what happens when those household AIs start feeding back information to their users in an ambient fashion. “I expect the news to find me…” — remember that quote?
I’m worried about all of this because I’m worried about how scholars are going to cope in the event of a sustained attack on their work. I’m worried about the pressure we as publishers might face. I’m worried about how the public is, and will be, informed about the way the world works. The weekend before the US election, I was in Washington, DC, attending a superb workshop on replication and reproducibility. One of the things that struck me, though, was the failure to consider the effects of bad actors on a process of transparency. Those maligned scholars from the University of East Anglia were a harbinger. Their data was hacked, cherry-picked and distorted for the purposes of political manipulation. They had no tools or defensive mechanisms with which to cope. The same will be true for #altmetrics. The same will be true for open data. The same will be true for preprint article repositories. How do you fancy dealing with a post-publication ‘journal’ that expressly goes out and cherry-picks data and research and the names of authors for political ends? Sounds fanciful? Here’s a fake newspaper for you. To everyone working on these things, I urge this: please, please, please, think about how this stuff might be misused and abused; think about the anti-patterns. Because somebody will, if they aren’t already. InfoSec isn’t just about the hardware; it needs to be about the information too.
Here at the Kitchen, we’ve had our arguments and disagreements with many over where scholarly publishing is going and should go. But there’s one thing that unites all of us. We all care very deeply about the business of the scholarly record. We ALL care about getting the research and the debate out there, so that the values of the Enlightenment can be advanced. We do. Don’t let our disagreements on the how cloud that fundamental fact. The future is here. Yes, it really is very unevenly distributed. We need to figure out, together, what we are going to do about that.