I usually prefer to reserve my blog for posting ideas explicitly related to social game design. In this post, however, I’d like to address an issue that has been a popular, albeit disconcerting, topic around water coolers lately: Facebook’s privacy controls. In particular, there is general anxiety over the fact that Facebook requires users to opt out if they wish to keep their information private, which of course makes most of their information public by default. You’ve read about it in the news, you’ve read the expert analysis and you’ve even seen very public displays of virtual suicide in protest. Naturally, I thought I’d offer my $.02 on the issue. (I’d also recommend an interesting article by Michael Zimmer and Chris Jay Hoofnagle in the Huffington Post about this issue and what they dub “blowforward” privacy PR, where they claim Facebook rolls out the “new” before getting consensus, then uses backtracking as a PR tactic when complaints are made.)
This notion leads me to my second point: perhaps we should be less concerned about exposing our personal information to each other, and a little more concerned about who is brokering that exchange. It’s important to keep in mind that while individual privacy settings can be made more restrictive and you can hide things from other site members, all of this information is still available to the owner of the network. Nothing is private from them. The real problem is that we simply don’t know, and might never have clear visibility into, what these organizations are actually doing with all of our information. And I’m not just picking on Facebook here – that’s too easy. We are constantly being asked to share our private information with companies and organizations of all types, and we have very little transparency, let alone control, over what is done with it (clickthrough privacy policies that can be changed at any time by the vendor are hardly a safeguard).
This point makes for a great segue into what I think is the more pressing issue here: data mining. Michael Zimmer, PhD, an assistant professor in the School of Information Studies at the University of Wisconsin-Milwaukee, perhaps best illustrates this idea in an essay. As Zimmer points out, it’s not Facebook privacy alone, or even privacy on social networks in general, that we need to be concerned with; what we really need to worry about is the data mining going on every time we give out bits of our personal information. The real worry is behavioural targeting, or what Oscar Gandy, a famous political economist, first identified in the early 1990s as “panoptic sorting”: a system wherein individuals are continually identified, assessed and classified for the purpose of coordinating and controlling their access to consumer goods and services.
Looking at it from this point of view, it starts to feel (to me, at least) like it doesn’t really matter that someone has to wade through 50 settings and 170 options on Facebook to safeguard their profile data from other individual viewers. In any case, the genie appears to be out of the bottle. We can’t simply delete our profiles from public spaces, and it wouldn’t make much difference now even if we did. What we need is a way to watch the watchmen. What do you think? Email me at michael [at] ayogo [dot] com.