The Philosophy of Facebook (or, the real reason Facebook doesn’t care about privacy)

Update: This is an early brain dump of some of the core concepts in my PhD thesis, which I’ve now completed. For a more fleshed out version of this, please see my dissertation, which can be downloaded here.

To say that Facebook does not care about privacy is really only half the story. Maybe even less than half.

Since the very beginning of Facebook, the company has consistently pushed the privacy envelope, but few people seem to be really asking why. The common conception seems to be that Facebook is simply making poor, ill-informed privacy choices (note the frequent use of the words ‘blunder’ or ‘misstep’ in stories describing the latest Facebook privacy issue). The Social Network, Aaron Sorkin’s cinematic account of Facebook’s founding and early days, portrays site creator and CEO Mark Zuckerberg as a somewhat socially and emotionally inept genius, motivated to create the site by a desire for women and entry into one of the prestigious Harvard final clubs. This account is perhaps unsurprising, given that the film was adapted from Ben Mezrich’s book The Accidental Billionaires, which drew on evidence from many parties, but not from anyone actually at Facebook, including Zuckerberg himself.

Nick Bilton, a New York Times tech writer, says that when he asked a Facebook employee what Zuckerberg thinks about privacy, the employee laughed and said “He doesn’t believe in it.” This gets us a bit closer to what is really going on: Zuckerberg doesn’t believe in privacy because he believes in radical transparency instead. As Anil Dash put it: “Facebook is philosophically run by people who are extremists about information sharing.” Time and time again, Zuckerberg has said that Facebook’s goal is to make the world more open, connected and transparent. He truly believes that improving communication by making it more efficient will make the world a better place. In 2008, at the Facebook Developer Conference, Zuckerberg stated: “In the world we’re building where the world is more transparent, it becomes good for people to be good to each other. That’s really important as we try to solve some of the world’s problems.” Earlier this year, Zuckerberg inadvertently revealed a (secret?) Facebook insignia, hidden inside his hoodie, which doesn’t appear anywhere on the site or in any official Facebook communications. Among other things, the insignia reads “Making the world open and connected.”

But here’s where it gets interesting. What no one seems to have asked is why Zuckerberg and everyone at Facebook are so committed to transparency, or why he thinks that being transparent and communicating efficiently will save the world. This way of thinking long pre-dates Facebook; indeed, it runs throughout the philosophies embedded in the modern internet. It is the culture of the Californian Bay Area, which has co-developed along with the technologies it has created. Described most simply, it is a form of technological utopianism whose rhetorical roots lie in Norbert Wiener’s cybernetics of the 1940s and 1950s and in 1960s American counterculture, fused with the computing and digital networking technologies of the 1980s and 1990s. One of the core tenets of this mode of thinking was the belief that flattened hierarchies and the blurring of traditional boundaries, enabled by computing and networking technologies, would bring about a more equal and democratic world where individuals could be themselves and would be free to determine their own destinies. The 1990s saw the infusion of the New Right‘s celebration of free markets and economic liberalism into the mix, which further blurred hierarchies and the boundaries between work/play, personal/professional and producer/consumer. This evolution and merging of philosophies and ideas gave us the Californian Ideology of the 1990s, which in turn spawned Web 2.0 in the mid-2000s.* But the most important aspect for our discussion of privacy, one that draws on Wiener’s cybernetics, is the notion that most of the world’s problems are problems of inefficient or closed communication, disorder, or poor information sharing. Computers, as systems, can be seen as sources of ‘moral good’ because they can solve these problems (see “The Human Use of Human Beings” for more on this).
If the entire universe is code (a favourite notion of Kevin Kelly), then the conversion or merging of the analog with the digital would turn the physical world into a manageable system, one that can be indexed, sorted and redistributed (and, of course, aggregated and data-mined as well), thus making the world ordered, open, efficient and transparent. In other words, better. Sound familiar?

Speaking to journalist Jose Vargas in 2010, Zuckerberg said the following: “Most of the information that we care about is things that are in our heads, right? And that’s not out there to be indexed, right?” Think about what we’re doing when we use Facebook. We’re creating digital versions of our relationships, activities, even our identities. We’re turning parts of our lives into code. And it’s not just Facebook. Consider Kevin Kelly’s predictions from his 2007 talk at EG:

“There’s like a billion social sites on the web. Each time you go into there, you have to tell it again who you are and [who] all your friends are. Why should you be doing that? You should just do that once, and it should know who all your friends are. So that’s what you want, all your friends are identified, and you should just carry these relationships around. All this data about you should just be conveyed, and you should do it once and that’s all that should happen. And you should have all the networks of all the relationships between those pieces of data. That’s what we’re moving into – where [the internet] sort of knows these things down to that level… what it’s doing is sharing data, so you have to be open to having your data shared, which is a much better step than just sharing your webpage or your computer. And all of these things that are going to be on this are not just pages, they are things. Everything we’ve described, every artifact or place, will be a specific representation, will have a specific character that can be linked to directly…[the internet of things where a] physical thing becomes part of the web so that we are in the middle of this thing that’s completely linked, down to every object in the little sliver of a connection that it has.” (italics mine)

Note here that Kelly, too, is advocating for this better world of openness through the merging of the atomic and the digital. All we have to do is be ‘open to it.’

Indeed, the boundary between the atomic and the digital is itself being blurred, which brings us back again to the legacy of cybernetics in Facebook. The cybernetic belief that flattened hierarchies and blurred boundaries are a social good can be seen in what has been called context collapse on Facebook, where everyone from the various contexts of our lives (friends, ex-lovers, acquaintances, employers and so on) is treated as essentially the same, and we have to present ourselves accordingly. This is not how things usually work in the physical world. We can go drinking with friends without getting in trouble at work, that is, until someone posts the drinking photos on Facebook and our boss sees them. When I spoke to Zuckerberg at SXSWi in 2008, he told me that he had concluded (based on research he had read) that people were happiest when they were the same in all contexts of their lives, and that was why he had designed Facebook the way he had, with only one profile for all life contexts.

Overall, when this set of philosophies is put into practice at Facebook, the result has been the (further) blurring of the boundaries of time and space, public and private, online and offline. Users are now faced with flattened social hierarchies, context collisions, confused relationships and identity-management issues, all of which are, essentially, issues of privacy. But, in Zuckerberg’s conception, these are neither problems nor threats to privacy. They are simply the growing pains as we get used to a more transparent, more open, more connected, more efficient, and thus improved world. In Zuckerberg’s better world of the future, privacy is obsolete.

*I should note that there is a lot more nuance to this story, which I’m glossing over for simplicity’s sake (see Fred Turner’s From Counterculture to Cyberculture for a meticulous and fascinating historical account, or Pim van Bree’s nice summation here.)

14 Responses

  1. Great work and very cool! Can I read your thesis when you’re done?

    The stuff Kevin Kelly says is really creepy. I think I’m one of the few people who actually wants to have my life completely encoded and indexed, but I want it to be in MY control. That’s the elephant in the room these guys seem to be missing.

    Wabasso December 3, 2010 at 10:28 pm #
  2. Wabasso–
    Yeah– that’s the trick with Zuck’s version of “openness”– he wants it all in his walled garden. If they were really concerned with radical transparency, they would be looking to something *standards based,* rather than finding a way to make FB a Wrapper for the Whole Internet.

    If it were based on open standards, YOU could control where your social graph did and did not go, and where you “house” it. Only having one option is ultimately insufficient. The trick is, I should be able to port my “identity” from Facebook to Friendster or Myspace or wherever, and keep it with whomever I trust. Or even set up my own distributed social network manager.

    Tad December 3, 2010 at 11:19 pm #
  3. I think you may be reading a little too much into the “flattening of hierarchies” here – it’s more of a technical constraint than a philosophical choice (though it’s entirely possible to start with a technical constraint, and later misrepresent or even believe it is the consequence of a philosophical choice, rather than the other way around). Fundamentally, categorically, computers are not good at hierarchies. Anything with substructure is computationally expensive; so if you want to do something on a huge scale, you avoid such things wherever possible. Wherever you can prefer “lists” to “trees” you do, because it means you are asking a machine to do less work, which means you can serve more users with less hardware. For exactly the same reason, GMail does not do hierarchical folders of mail, but only “labels” – not because it’s better, but because it’s easier. It does indeed change the way people interact with email, and that can later be justified as an improvement.

    The history of humans interacting with computers is a history of co-evolution. After 40 years, “artificial intelligence” is still a failed experiment. What is much easier is to get people to generate data and behave in ways that are pre-cooked for being chewed on digitally. Facebook is another step in the co-evolution – if you can get people to conduct more of their interpersonal interactions mediated through a machine that offers a limited set of gestures for those interactions, then you get better quality machine-ready data about people. Similarly, cell phones are little generators of machine-friendly data about human interactions and behavior.

    To be pedantic for a moment: We’re not turning parts of our lives into code – code does things and reacts and can even theoretically rewrite itself. We’re turning parts of our lives into data – which is passive – what Facebook does is motivate human beings to babysit that data, and update and add to it.

    Tim Boudreau December 4, 2010 at 2:01 am #
@Tim there’s a false dichotomy, as Ontario’s privacy commissioner would have you know, between technical constraints and privacy. Yes, if you bolt privacy onto a model where it was not considered important from the start, you will face horrible processing issues. But if you embed privacy in the very design of your solution, the argument about technical constraints becomes moot.

    “It’s too hard to do it that way,” shouldn’t be an excuse.

    Krupo December 5, 2010 at 2:53 am #