User Privacy Versus Child Safety

Apple to scan user phones for images of child abuse.

Image: Apple CEO Tim Cook discussing privacy beneath a “Privacy” sign.

Apple, which has made a point of its commitment to user privacy, announced that it will scan iPhones for evidence of child abuse.

What’s new: The tech giant will include an on-device machine learning model that flags known images of child sexual abuse stored in the photo library. Privacy advocates said the feature could be used to spy on innocent people.
How it works: When a user uploads a photo from their phone to iCloud, a tool called neuralMatch checks it against known examples of child pornography.

  • The neuralMatch tool compares an image’s digital signature, called a hash, to those of abusive images previously identified and validated by at least two child-welfare groups (a simplified sketch of this kind of hash matching follows this list). Upon detecting an offending image, the system alerts a human reviewer who may notify law enforcement.
  • Security experts worry that Apple could expand neural match to process images shared via its messaging app, providing a backdoor into the chat system’s end-to-end encryption.
  • On its website, Apple emphasizes that its technology is designed only to search for images of child sexual abuse. Further, it said it would deny government requests for targeted searches of individual users and for data that doesn’t match the system’s original parameters.

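Apple has not published the internal details of its matching system, so the sketch below is purely illustrative. It uses a generic perceptual hash from the open-source Python imagehash package to show the basic pattern the article describes: hash a photo, then compare that hash against a list of known hashes and flag anything within a small Hamming distance. The threshold, file name, and hash value here are made up.

```python
# Illustrative sketch only: Apple has not disclosed how its matching system works.
# A generic perceptual hash (the open-source `imagehash` package) stands in for
# Apple's proprietary hashing to show the basic compare-against-known-hashes idea.
from PIL import Image
import imagehash

# Hypothetical hashes of previously identified abusive images (placeholder values).
KNOWN_HASHES = [imagehash.hex_to_hash("ffd8b1a2c4e0917f")]

MATCH_THRESHOLD = 4  # maximum Hamming distance treated as a match (assumed value)

def matches_known_image(path: str) -> bool:
    """Return True if the photo's perceptual hash is close to any known hash."""
    photo_hash = imagehash.phash(Image.open(path))
    return any(photo_hash - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    # In the system the article describes, a match would route the image to a
    # human reviewer rather than trigger any automatic action.
    print(matches_known_image("upload.jpg"))
```

Apple’s actual system reportedly layers a learned hash function and cryptographic protocols on top of this kind of comparison, so the snippet should be read only as a conceptual analogue.
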
Behind the news: Apple’s CEO Tim Cook has called privacy a “fundamental human right,” and the company boasts that its users have the final say over uses of their data.

  • Privacy was a theme of the most recent Apple Worldwide Developers Conference, where the company showcased features that stymie email trackers, hide IP addresses, and identify third-party apps that collect data.
  • In 2016, Apple resisted U.S. government requests to unlock an iPhone belonging to a suspected terrorist. A commercial cybersecurity firm ultimately unlocked it.
  • Nonetheless, Apple can hand over some user data, particularly information stored in iCloud, in response to a legal warrant.

Why it matters: Apple has been a holdout for privacy amid a tech-industry gold rush for user data. Its decision to budge on this issue suggests an inevitable, broader shift away from protecting individuals and toward making society safer.

We’re thinking: Child abuse is a global problem, and tech companies including Facebook, Google, and Microsoft have banded together to fight it. While we support this effort, we worry that, perhaps under government pressure, scanning photo libraries could turn into scanning other types of content, and that the aim of keeping children safe could veer toward less laudable goals.
