Join the “Screeching Voices of the Minority”
Image by PublicDomainPictures from Pixabay

Last week, Apple announced three changes to upcoming software updates aimed at detecting CSAM (child sexual abuse material) while supposedly maintaining user privacy. An uproar ensued among privacy activists, and Apple has doubled down.

Expanding guidance in Siri and Search

Of the three changes, this one has been the least controversial. Coming in updates to the iPhone, iPad, Apple Watch, and macOS later this year, Siri will have specific responses to queries about child exploitation and CSAM:

Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

Apple’s Child Safety page

This seems innocent enough and hasn’t gotten much attention. There have been problems in the past with Siri and other voice assistants listening in on conversations when they shouldn’t, or exposing audio and other user information to the company and its contractors. If Apple were to begin tracking Siri queries about CSAM, that could also cause issues for academics and researchers who are fighting CSAM and child abuse.

CSAM Detection

Apple attributes this new feature to a collaboration with the National Center for Missing and Exploited Children (NCMEC), a “comprehensive reporting center for CSAM” that “works in collaboration with law enforcement agencies across the United States.” If you have your device set up to automatically store pictures in iCloud Photos, the CSAM detection system will scan photos on your iPhone or iPad before they’re uploaded to Apple’s servers.

NCMEC maintains a database of known CSAM images. The detection feature will compare hashes (compact fingerprints of images) computed on your device against a list of hashes provided by NCMEC. If enough matches accumulate in your iCloud Photos library, Apple will manually review your account, disable it, and report it to NCMEC, which works with law enforcement.
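
Conceptually, the flow is simple: hash each photo, check it against the list, count the matches, and only act once the count crosses a threshold. The sketch below is not Apple’s implementation: Apple uses its own perceptual hash (it calls it NeuralHash) plus cryptographic blinding so that neither side learns about non-matches, whereas here a plain SHA-256 of the file bytes stands in for the hashing step and the threshold value is invented.

```swift
import Foundation
import CryptoKit

// A minimal sketch of the matching flow, not Apple's implementation.
// SHA-256 of the raw file bytes stands in for a perceptual hash, and the
// threshold is an arbitrary placeholder for "enough matches".
let matchThreshold = 30

/// Stand-in for a perceptual hash. A real perceptual hash maps visually
/// similar images to near-identical values; a byte-level hash does not.
func imageHash(of url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Count photos whose hash appears in the known-CSAM list; the account is
/// only surfaced for human review once the count crosses the threshold.
func shouldFlagAccount(photoURLs: [URL], knownHashes: Set<String>) -> Bool {
    var matches = 0
    for url in photoURLs {
        guard let hash = try? imageHash(of: url) else { continue }
        if knownHashes.contains(hash) { matches += 1 }
    }
    return matches >= matchThreshold
}
```

The cryptography Apple layers on top changes who can see what and when, but not this basic control flow: hash, compare, count, flag.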

In a 12-page technical PDF on its site, Apple argues that this system is designed in a privacy-conscious way. The technical work behind it is genuinely interesting, but the system as a whole contradicts Apple’s long-standing marketing commitment to user privacy.

Nudity detection in Messages

The third addition, for iOS 15, iPadOS 15 and macOS Monterey, involves the Messages app. Apple is adding a system “to warn children and their parents when receiving or sending sexually explicit photos” through the app. The alerts will apply to users on an iCloud Family account. If a child sends or receives a message with a photo that an algorithm classifies as nudity, a series of things will happen. The child will see a warning that the message is potentially dangerous. If the child is under 13 and chooses to send or view the image anyway, their parents will be notified.
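
Stripped of the interface, that flow boils down to something like the sketch below. The classifier result, the age value, and the notification hooks are all hypothetical stand-ins; Apple has not published this code.

```swift
// Hypothetical sketch of the Messages flow described above, not Apple's code.
// The parameters and helper functions are invented stand-ins.
func handleExplicitPhoto(flaggedByClassifier: Bool,
                         childAge: Int,
                         childTapsThroughWarning: Bool) {
    // Nothing happens for photos the on-device model does not flag.
    guard flaggedByClassifier else { return }

    // The child always sees a warning first.
    warnChild()

    // Parents on the iCloud Family account are notified only when the child
    // is under 13 and still chooses to view or send the image.
    if childAge < 13 && childTapsThroughWarning {
        notifyParents()
    }
}

func warnChild() { print("Warning: this photo may be sensitive.") }
func notifyParents() { print("Parental notification sent.") }
```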

Unlike the CSAM detection, this system does not work against a predefined list of content. It uses machine learning to detect sexually explicit images. When Tumblr implemented a similar policy and automated system in 2018, its filter produced false positives on a wide range of content and spawned the “#TooSexyForTumblr” hashtag on Twitter.

What did the filter have a problem with?

Well, a heart-shaped necklace, a boot-scrubbing design, LED jeans, troll socks, a Louis Vuitton bag, some boxes, a tire, a hanger, a flamingo floatie, shoes, pillows of all sorts, and so much more.

EFF: What Tumblr’s Ban on ‘Adult Content’ Actually Did
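
That failure mode is structural: a classifier only produces a confidence score, and anything above a fixed cutoff is treated as explicit, whether it is actual nudity or a flamingo floatie. A toy illustration, with entirely made-up numbers unrelated to any real model from Apple or Tumblr:

```swift
// Toy illustration of a score-threshold filter; the cutoff and scores are
// invented and unrelated to any real model.
func isFlaggedAsExplicit(confidence: Double, cutoff: Double = 0.80) -> Bool {
    confidence >= cutoff
}

isFlaggedAsExplicit(confidence: 0.83)  // true: a necklace photo becomes a false positive
isFlaggedAsExplicit(confidence: 0.99)  // true: a genuinely explicit image
```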

The move was largely in preparation for Verizon, which owned Tumblr at the time, to sell the service to Automattic, the company behind the popular blogging platform WordPress, for just under $20 million.

What happens on your iPhone… goes to Apple?

Apple billboard in Las Vegas during CES 2019, with white text on a black background reading “What happens on your iPhone, stays on your iPhone. apple.com/privacy” next to an outline of an iPhone. Image: Chris Velazco on Twitter

Apple has long promoted itself as a privacy-focused company. And it’s true that, for the most part, the privacy features of its products are better than those of alternatives from Google or Microsoft. But these changes fundamentally break the premise that “what happens on your iPhone, stays on your iPhone.”

As Aral Balkan says, “This is not a slippery slope. It’s a cliff edge.”

Once the technology is in place to detect certain types of images, even with the privacy protections guaranteed in the technical explanation, nothing prevents Apple from deciding to add, or being coerced into adding, hashes or machine learning models for other types of content you store, send or receive. In fact, Apple already uses machine learning to group together pictures of your cats, family members and favorite vacation spots.
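
In terms of the earlier hash-matching sketch, the system’s entire scope lives in the list it is handed, and broadening that scope needs no new engineering. The list names below are entirely hypothetical:

```swift
// Hypothetical illustration of the "cliff edge": once the matching pipeline
// exists, widening what it looks for is just a matter of shipping a bigger
// hash list. None of these list names are real.

/// Stand-in for loading a hash list distributed with the operating system.
/// Empty here; only the shape of the code matters.
func loadHashList(named name: String) -> Set<String> { [] }

var blockedHashes = loadHashList(named: "ncmec-known-csam")

// A future policy change, or government pressure, is one line away:
blockedHashes.formUnion(loadHashList(named: "politically-sensitive-images"))
blockedHashes.formUnion(loadHashList(named: "copyright-enforcement"))
```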

Too many opportunities for abuse

Even if we take Apple at its word that it will never extend the systems implemented here to other types of content, the Messages feature alone could literally get people killed.

There is no enforced requirement for a parent to enter their child’s birthday accurately in iCloud Family. A parent can claim their child is under 13, which triggers the automatic notification to the parent whenever the child sends or views content flagged as sexually explicit. For LGBTQ children and teens in ultraconservative, restrictive and abusive households who are trying to explore and define their sexuality without their parents’ knowledge, tipping off those “caretakers” could create extremely dangerous situations.


First it was the war on drugs, then international terrorism, then domestic extremism; now it is CSAM. CSAM is a problem. It existed long before the Internet, and perpetrators will adapt. It is a social problem, not a technical one. It is all too real and terrifying, but it cannot be solved by adding privacy-invading features.

In response to the obvious privacy concerns raised by experts, NCMEC sent a letter to the team at Apple that worked on these features: “We know that the days to come will be filled with the screeching voices of the minority.” (emphasis mine)

Change often starts with a screeching minority drawing attention to an issue. At this point, over 6,200 people (myself included) have signed an open letter to Apple asking that:
1) Apple Inc.’s deployment of its proposed content monitoring technology is halted immediately.
2) Apple Inc. issue a statement reaffirming their commitment to end-to-end encryption and to user privacy.

The fact that NCMEC’s letter had to be sent and circulated so quickly signals that both NCMEC and Apple know that influential and trusted privacy voices have real concerns about these moves. Collectively, our screeches will get louder until they cannot be ignored.