We Don’t Understand Privacy

Over 1.5 billion people worldwide use the Facebook app on a monthly basis, and every one of them has opted in to Facebook’s privacy policy by the act of creating an account. Facebook, like many large companies, has expended its fair share of effort to clarify and simplify the privacy settings for the massive amount of data that falls under its custody. But let’s be honest: the majority of users don’t read the privacy policy, nor do they even understand what it is. We simply assume a “reasonable expectation of privacy.”

When I launched the Facebook app a few months ago, I was perturbed. At the top of the screen, the app displayed photos from my camera roll for posting. I had previously granted the Facebook app access to my camera roll for what I thought was a single use case: when I chose to share photos on Facebook, I would initiate the upload. But now the app was making things more “convenient” for me by monitoring new photos added to my camera roll and suggesting that I upload them to Facebook.

This change in app behavior is no doubt spurred in part by a material decrease in personal sharing on the service. As the service has embraced more professionally created/shared content, users have been sharing fewer personal moments – making Facebook less of a social network and more of a curated search engine.

Unclear to me is whether the photos appearing in the Facebook app had already been uploaded and analyzed by Facebook servers. And there’s no way to determine this from the Privacy Policy, which ambiguously states, “We collect the content and other information you provide when you use our Services…This can include information in or about the content you provide, such as the location of a photo or the date a file was created.”

Does merely launching the app give Facebook the ability to scan and analyze my camera roll?

We don’t understand privacy.

This isn’t a theoretical issue. As we have recently seen, significant strides in machine learning have made it possible to identify objects, locations, and faces – and surprisingly, it doesn’t take a full-resolution photo to do so. One service, Clarifai, needs only a 256px image to identify content with human-level accuracy. (And since all machine learning algorithms need a ton of data, why not mine the billions of photos taken every day?) Now imagine what happens when photos that have legitimate reasons not to be seen by anyone else (e.g. embargoed news, behind-the-scenes shots, otherwise personal and private photos) are uploaded and analyzed by companies claiming that your privacy is preserved because the photos aren’t being shared.
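To see why low-resolution mining is so cheap for a platform, consider the arithmetic. The 256px figure comes from the Clarifai claim above; the 12-megapixel source resolution is an illustrative assumption, not a number from any company. A recognition-ready thumbnail is a tiny fraction of the original photo:

```python
# Back-of-the-envelope sketch: how little data a recognition-ready
# thumbnail carries compared to the original photo.
# Assumptions: a ~12 MP camera (4000x3000) and uncompressed RGB pixels.
full_w, full_h = 4000, 3000      # ~12 MP photo (assumed resolution)
thumb_w, thumb_h = 256, 256      # resolution reportedly enough for recognition
bytes_per_pixel = 3              # uncompressed RGB

full_bytes = full_w * full_h * bytes_per_pixel      # 36,000,000 bytes
thumb_bytes = thumb_w * thumb_h * bytes_per_pixel   # 196,608 bytes

print(f"full photo: {full_bytes / 1e6:.1f} MB uncompressed")
print(f"thumbnail:  {thumb_bytes / 1e3:.1f} KB uncompressed")
print(f"reduction:  {full_bytes // thumb_bytes}x")
```

In other words, a service could analyze your entire camera roll while transferring less than 1% of the pixel data, which is why “we only upload small previews” offers little privacy comfort.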

Facebook’s virtual reality platform, Oculus Rift, has also raised a number of eyebrows with terms of service that cover your IP address, “precise location,” and “physical movements and dimensions.” The potential privacy issues were significant enough for Senator Al Franken to send a letter asking for clarifications on data privacy.

We don’t understand privacy.

  • We don’t understand, in part, because the pace of technological change continues to outstrip our ability to comprehend its implications.
  • We don’t understand because companies care more about improving engagement than about tackling the subtleties of privacy.
  • We don’t understand because short-term convenience muddies our ability to consider potential abuse.
  • We don’t understand because machine learning and networking algorithms can transform a seemingly innocuous piece of data (e.g. a photo) into something potentially powerful or destructive.
  • We don’t understand because companies don’t tell us the myriad ways they might be using our data in supposedly “unidentifiable” ways.

In 2012, billionaire entrepreneur Michael Dell forced his daughter to shut down her Twitter and Tumblr accounts because she was manually posting her location and future schedule, undermining Dell’s $2.7m-per-year security detail. But nowadays, we don’t need a user to reveal their location to know where they are. We don’t even need a face to recognize them.

In some ways the problem seems intractable. Even if you’ve never participated in social media, your photos might be stored in the cloud with Apple or Google, who have more than enough metadata to use your data in ways that you would probably consider a violation. I suspect the solution is a combination of vigilance on the part of consumers in demanding clarification of privacy policies, and of the larger companies appointing ombudsmen to augment their own privacy advocates.

Of course, the solution will remain murky when we can’t even understand the scope of the problem in the first place.

This article was written by Allen Murabayashi, the co-founder of PhotoShelter.

There are 3 comments for this article
  1. Lawrence Hudetz at 11:02 am

    Well, storing images on Photoshelter is a form of cloud storage, and if you don’t “understand the scope of the problem”, what privacy violations, in terms of “use your data in ways that you would probably consider a violation” possibly exists on Photoshelter?

    What is the bottom line here? How can you, the CEO, not understand a basic tenet of cloud storage?

    • Allen Murabayashi Author at 8:07 am

      I understand cloud storage. But the rules surrounding the mining of data stored in the cloud by various companies are still very much up for grabs.

  2. Lawrence Hudetz at 11:37 am

    Thanks, Allen. After checking with staff, I’m told PhotoShelter is not using a cloud service. However, from my POV, unlimited uploading and storage has the appearance of being a cloud service; in fact, the three plans offered include the term “cloud storage,” hence my comment.

    There is a bit of confusion present. Is PhotoShelter a de facto cloud service vis-a-vis Google or Amazon cloud? How does mining data differ here?
