Updated July 2021: Fighting the spread of child abuse images is an essential responsibility for Internet companies—but there’s a right way and a wrong way to do it. Blocking their systems from accepting uploads of known abuse images? That’s the right way. Using secret and untested artificial intelligence (AI) algorithms to scan all your chats, emails, and photos? The wrong way. Yet in July 2021 that’s exactly what European lawmakers authorized Internet companies to do.
While plans are afoot to challenge this law in court, European bureaucrats are already pushing for more, seeking to make these scanning obligations both permanent and mandatory. That would be a disaster, putting millions of people’s most private communications under the eyes of Big Tech employees.
Don’t fall for the false “think of the children” rhetoric. Voluntary scanning for known child abuse images has been in use for only about a decade, and the AI tools used in this sector are secret, experimental, and reinforce racial and sexual prejudices. We mustn’t hand these untrusted AI tools carte blanche to conduct private surveillance of everything you do online.
Governments simply can’t be allowed to get away with this. The fundamental human right of privacy extends to the digital world. That’s why the government can’t read your emails without a warrant. But just like the NSA programs that Edward Snowden uncovered, this new proposal amounts to doing exactly that. The only difference is the gift-wrapping: this time the packaging is “protect kids” rather than “anti-terrorism.”
The Commission attempts to justify this audacious proposal on the basis that files on your device won’t be sent directly to the government—only a unique identifier of them will be. But this is a distinction without a difference. If the solution that the Commission favors is adopted, unique information about every image on your device will be sent to government agents. This puts your privacy at risk, especially because in the hands of repressive governments and other bad actors, it is certain to be misused.
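To see why an “identifier” is as revealing as the file itself, consider a minimal sketch of hash-based matching. This is an illustration, not the implementation any real scanning system uses (deployed systems typically rely on perceptual hashes rather than the cryptographic hash shown here), and the blocklist entries and function names are hypothetical:

```python
import hashlib

def file_identifier(data: bytes) -> str:
    """Return a SHA-256 hex digest: a unique identifier for the file's content."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of identifiers held by the scanning authority.
BLOCKLIST = {file_identifier(b"known-image-bytes")}

def would_be_flagged(data: bytes) -> bool:
    # The same mechanism can flag ANY file whose identifier is added to the
    # list -- nothing about it is limited to abuse imagery.
    return file_identifier(data) in BLOCKLIST
```

The key point is that the identifier deterministically pins down the file: whoever controls the blocklist decides what gets flagged, which is why the technology cannot be confined to one category of content by design.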
The Commission may claim now that they will only ever be looking for child sexual abuse material (CSAM), but it would be foolish and naïve to believe this. The spying technology that the government is proposing could be used to identify any file on your device, and sooner or later, it will be. Combatting CSAM is no excuse to conduct blanket surveillance of the population. There are better ways to combat CSAM at the source, ways that don’t require installing a 24/7 spying device in everyone’s pocket—yet governments are ignoring them.
Sign our petition to the European Commission: NO Big Brother in my pocket!