Apple Tackles Child Sex Abuse Imagery: Slippery Slope or Necessary Intervention?

Apple recently announced a new set of features aimed at combating Child Sexual Abuse Material (CSAM), including the ability to scan a user’s photos and iMessages. Since the announcement, the company has emphasized the numerous safeguards it has built in, but privacy advocates have warned of the potential for abuse and “mission creep.”

The proliferation of CSAM has become an epidemic. The National Center for Missing and Exploited Children (NCMEC), the U.S. organization responsible for tracking CSAM, has reported an exponential increase in the number of images and videos it receives, growing from roughly 600,000 a decade ago to over 70 million in 2019.

Although pundits have speculated as to Apple’s motivation for releasing the feature, it’s clear that Apple’s status as a market maker means that other electronic service providers (ESPs) will be forced to take note. In the U.S., there is no legal obligation for companies to scan for CSAM, and consequently, many don’t – including Dropbox, Google, Amazon, and others – even though it’s almost a certainty that their systems store the content.

Facebook has been one of the few high-profile ESPs to scan every image and video for CSAM. But as its former Chief Security Officer Alex Stamos explained in a recent Stanford Internet Observatory (SIO) video, CSAM detection typically occurs when content is shared (whether through a messaging app or a shared album). Apple’s proposed system instead works locally on your phone when the feature is enabled, raising the question of who actually owns your computing device if a tech company can snoop on images and videos that are ostensibly private.

Apple has stated that the feature will roll out in the U.S. first, and will only be deployed in other countries after further examination. They have also unequivocally stated that the technology will only be used for CSAM, and not to satisfy another country’s demands (e.g. identifying “terrorists” or political dissidents).

For photographers, the potential breach of privacy is concerning. Photographers of all stripes – from photojournalists to landscape photographers – have legitimate reasons for ensuring that their content isn’t seen by anyone (human or machine) until they choose to disseminate or publish it. The notion of Canon, Sony, or Nikon running content scans on your camera is horrifying, and it’s not an unfair analogy.

Stanford Internet Observatory research scholar Riana Pfefferkorn made the point in a recent roundtable discussion that technology can’t fix the underlying sociological, historical, and poverty-fueled conditions (particularly in Southeast Asia) under which child rape and abuse take place and are recorded.

That said, most experts agree that without a coordinated and concerted effort on the part of ESPs, the proliferation of CSAM will continue unabated. Apple’s solution might catalyze a more coherent industry response, or it might prove a slippery slope into a terrible ethical conundrum with a tragic human toll.

We mention the following photographers, articles, and websites in this episode of Vision Slightly Blurred.

This article was written by Allen Murabayashi, Chairman and co-founder of PhotoShelter.

There is 1 comment for this article
  1. Chris Montcalmo:

    This is a terrifying overreach on Apple’s part. Not to mention, they are now automatically assuming all of their customers are criminals. “I’m sure there’s no CSAM on your phone, but let me see your photos just to be sure.” Today, it’s CSAM, tomorrow it could be political dissenters. Our team has purchased our last round of Apple products.
