Update October 14: Success! Apple has announced that in response to the concerns that we and others expressed, it will be suspending its plans to weaken iPhone security for now.
In August 2021 Apple announced plans to turn every U.S. iPhone into a surveillance device, using your own phone to monitor your images with built-in spyware that can’t be turned off. The measures are twofold: an on-device database of fingerprints (image hashes) of known child abuse images, which will be checked against your photos when they are synced with Apple’s cloud, coupled with an AI-based technology to detect even previously unknown images of nudity sent to or from a child. Matches against known abuse images, once they cross a threshold, will trigger human review and reporting to authorities, while unknown images will trigger a message to the child’s parents.
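To make the first of those two measures concrete, here is a rough sketch of threshold-based fingerprint matching. Everything in it is an illustrative placeholder: Apple's actual system uses a neural perceptual hash ("NeuralHash") and a cryptographic safety-voucher protocol, not a plain cryptographic hash or a simple counter, and the real threshold value was not disclosed.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash. Apple's NeuralHash is designed to
    # survive resizing and re-encoding; SHA-256 is used here only to keep
    # the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: list[bytes], known_hashes: set[str],
                 threshold: int) -> bool:
    # Count images whose fingerprint appears in the on-device database of
    # known abuse imagery. Only once the count crosses the threshold would
    # the system flag the account for human review.
    matches = sum(1 for img in images
                  if image_fingerprint(img) in known_hashes)
    return matches >= threshold
```

The point of the threshold, on Apple's account, is that a single false match should never expose a user's photos to review; the privacy objection in this article is about what happens to the database and the reporting pipeline, not the arithmetic.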
Although Apple claims that these technologies will only be used to combat child sexual abuse, it would be naïve to believe that they won’t be expanded. This spying technology could be used to identify any file on your device, and sooner or later, it will be. Combating image-based child sexual abuse (also called CSAM) is not an excuse to conduct blanket surveillance of the population. There are better ways to combat CSAM at the source that don’t require installing a 24/7 spying device in everyone’s pocket… yet our society is ignoring them.
Apple simply can’t be allowed to get away with this. The fundamental human right of privacy extends to the digital world. That’s why the government can’t read your emails without a warrant. But just like the NSA programs that Edward Snowden uncovered, this new proposal amounts to doing exactly that. The only difference is the packaging: this time it’s “protect kids” rather than “anti-terrorism.”
We warned that this was coming more than a year ago, when plans for client-side spyware in the next generation of smartphones and computers were exposed in a leaked European Commission paper. Prostasia Foundation wrote to oppose this move, warning that these measures “undermine the security and safety of adults and children alike, while failing to deter abusers from sharing such images by other means.”
Like the European Commission’s proposal, Apple’s CSAM detection technology involves collecting unique information about every image on your device, and secretly sending it to human analysts if the technology classifies it as suspicious. These analysts will have the power to decrypt those images and pass them on to authorities. This puts your privacy at risk, especially because in the hands of repressive governments and other bad actors, it is certain to be misused.
Prostasia Foundation is the only advocacy organization that promotes evidence-based solutions to child sexual abuse, while opposing those that infringe fundamental human rights. Donate here to support our work!