Privacy has become the watchword in social networking. We all worry about an invasion of our privacy, usually thought of as either a direct release of confidential information or an indirect insight pieced together from many small, separate bits of information about us (e.g., knowing when to rob our house from posted travel plans or the locations attached to our tweets).
Facebook is no stranger to privacy complaints. Despite its checkered past and flashpoint status, Facebook has no choice but to continue to test the boundaries of privacy — its business model depends on people divulging things about themselves. Its privacy policies have been gradually shifting, in ways users realize and in ways users don’t quite see or understand.
. . . the successive policies tell a clear story. Facebook originally earned its core base of users by offering them simple and powerful controls over their personal information. As Facebook grew larger and became more important, it could have chosen to maintain or improve those controls. Instead, it’s slowly but surely helped itself — and its advertising and business partners — to more and more of its users’ information, while limiting the users’ options to control their own information.
Recently, Facebook announced the Open Graph Protocol, which makes it easier for outside sites to share information with Facebook when visitors want to recommend a page.
On the heels of this new initiative, Technology Review interviewed Danah Boyd of Microsoft Research New England. Boyd is a social media researcher and a vocal critic of Facebook’s approach to privacy.
Facebook argues that social norms are changing, and the old definitions of privacy are outdated. Critics point out that Facebook itself is a major force in changing these social norms in its efforts to erode privacy to drive its business. As Boyd says:
I think the social norms have not changed. I think they’re being battered by the way the market forces are operating at this point. I think the market is pushing people in a direction that has huge consequences, especially for those who are marginalized.
We all inhabit multiple roles in life — employee, researcher, parent, spouse, child, friend, neighbor — and what may be fine in one role (sharing a long night with friends over drinks) may look completely inappropriate when seen by people expecting you to fulfill another role (boss, parent, spouse). The erosion of privacy weakens the bulwarks we expect between these roles, and that can make us nervous or prove embarrassing or awkward.
We’ve all seen the religious, political, or social views of old friends and co-workers revealed on Facebook, even though those views never mattered to our relationships with these people and, worse, may make it harder to look at those people the same way afterward. You can’t unlearn the fact that Person A was just revealed as a Scientologist, for example.
As Boyd notes, it’s especially bad for teachers:
[Teachers] have a role to play during the school day and there are times and places where they have lives that are not student-appropriate. Online, it becomes a different story. Facebook has now made it so that you can go and see everybody’s friends regardless of how private your profile is. And the teachers are constantly struggling with the fact that, no matter how obsessively they’ve tried to make their profiles as private as possible, one of their friends can post a photo from when they were 16 and drinking or doing something else stupid, and all of a sudden, kids bring it into school.
Some reactions to these perceptions of privacy erosion are stronger than others. Some critics urge others to dump Facebook specifically, and accuse Facebook of nearly evil behavior. Business Insider has a list of 10 Reasons to Delete Your Facebook Account. They include:
- Facebook’s Terms of Service are completely one-sided
- Facebook’s CEO has a documented history of unethical behavior
- Facebook has flat-out declared war on privacy
The essential message from the full list is that Facebook is trying to redefine privacy to suit its purposes — commercial purposes based on a plan to become the dominant force online.
Expectations for privacy are very high among the critics of Facebook. As Thomas Baekdal stated in his first rule of privacy:
I am the only one who can decide what I want to share.
In light of this very simple and reasonable rule, it’s tempting (and perhaps too easy) to say that these social networks must reflect social expectations and norms as they exist, and not try to shift them to suit their engineering preferences, business models, or tin-eared anthems of social media utopianism.
However, a recent paper on arXiv calculates a mathematical privacy threshold for social recommendation engines, one that is probably lower than current social norms would accept. The authors believe their calculations point to a fundamental limit on privacy in social networks: the more people and recommendations present, the further this threshold shifts toward no privacy at all. In other words, to get social recommendations we have to give up some of our privacy, and the more people who share and seek social recommendations, the less privacy there is. As the authors put it:
This finding throws into serious question the feasibility of developing social recommendation algorithms that are both accurate and privacy-preserving for many real-world settings.
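The paper’s own model isn’t reproduced here, but the flavor of the trade-off can be illustrated with a toy sketch using the standard Laplace mechanism from differential privacy: a recommender adds noise to its item scores before picking a winner, and the stronger the privacy guarantee (smaller epsilon), the less often it picks the genuinely best item. The affinity scores below are hypothetical, invented for illustration.

```python
import random

def laplace_noise(scale):
    # The difference of two i.i.d. exponential samples is Laplace-distributed.
    lam = 1.0 / scale
    return random.expovariate(lam) - random.expovariate(lam)

def private_recommendation(scores, epsilon):
    """Return the index of the top item after adding Laplace noise of
    scale 1/epsilon to each score. Smaller epsilon means stronger
    privacy, which means a noisier (less accurate) recommendation."""
    noisy = [s + laplace_noise(1.0 / epsilon) for s in scores]
    return max(range(len(noisy)), key=noisy.__getitem__)

# Hypothetical affinity scores for five items; item 0 is the true best pick.
scores = [0.9, 0.2, 0.1, 0.4, 0.3]

def accuracy(epsilon, trials=2000):
    # Fraction of trials in which the noisy pick matches the true best item.
    hits = sum(private_recommendation(scores, epsilon) == 0 for _ in range(trials))
    return hits / trials

for eps in (0.1, 1.0, 10.0):
    # Low epsilon hovers near chance; high epsilon approaches certainty.
    print(f"epsilon={eps:>4}: accuracy ~ {accuracy(eps):.2f}")
```

Run repeatedly and the pattern holds: useful recommendations require weak privacy, and strong privacy degrades the recommendations, which is the tension the paper formalizes.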
Facebook is a flashpoint among social networks — being the leader, it’s on the forefront of criticism. But if this recent paper is correct, the genre itself may demand a change in social expectations of privacy among users. It may not be Facebook’s fault or Mark Zuckerberg’s business cynicism at work. It may be reality, and the critics may just be scapegoating Facebook.
Perhaps Facebook’s sense of shifting social norms is right, informed by years of watching a major social network blossom around them. The trade-off those observations suggest is simple: if people continue to use and rely upon social networks, they are implicitly accepting a lower threshold of privacy.