Facebook warned about risks of relying on algorithms to protect children

Facebook is using new artificial intelligence tools to detect child grooming and nudity

Prostasia Foundation urges Facebook to ensure such tools undergo adequate public review

Berlin – November 28, 2018 – New technologies that aim to detect naked images of children and attempts to sexually groom children were discussed today at a meeting between Facebook, child protection group Prostasia Foundation, and other experts at Facebook's Berlin office.

"I was going through some old photos with my partner recently that he had never seen before," said Meagan Ingerman, 32 of Oakland, who works for Prostasia Foundation. "We are planning to digitize them and put them online, and it turns out that they include some innocent childhood photos of myself in the bath. I would once have scanned them along with the others without a second thought, but these new technologies could result in them being flagged."

The new photo-scanning technology was announced by Facebook in October, following a similar announcement by Google in September. Previous technologies could only detect images that had previously been identified as child pornography by law enforcement authorities or by agencies such as the National Center for Missing and Exploited Children (NCMEC) or Britain's Internet Watch Foundation (IWF).

The new technologies are claimed to use artificial intelligence (AI) to identify naked images of children that are being uploaded for the first time. Such images will be referred to human moderators at Facebook and Google, and may also be forwarded on to the NCMEC or IWF, which maintain blacklists of such images that can be used to prevent them from being re-uploaded.

"I don't want my childhood photos being added to a child porn blacklist," said Ingerman. "I don't even want a human moderator seeing them. The photos are for my family's eyes only. This kind of technology is well-intentioned, but there are huge privacy and security problems that nobody is talking about."

National Security Agency (NSA) whistleblower Edward Snowden claimed in 2014 that intercepted naked photos of those under surveillance were routinely passed around at the agency. Ingerman raised similar concerns about tech companies. "I barely trust them with my credit card details, let alone my nudes. And for today's generation of young people, storing nude photos in the cloud is the norm."

Speaking to Reuters last month, Facebook's global head of safety, Antigone Davis, admitted that the technology would make mistakes. "We'd rather err on the side of caution with children," she said.

In addition to the photo-scanning technologies, Facebook is also experimenting with technologies that can detect and prevent the sexual grooming of children using the social network. On November 7-8, tech companies came together at Microsoft's headquarters in Redmond to develop a prototype of a similar tool that other Internet platforms can use to automatically detect online grooming behaviors. Prostasia Foundation, which also attended that meeting, has announced plans to audit the tool once it is complete.

"We abhor child online grooming and the distribution of unlawful sexual images of children", said Prostasia Foundation's Executive Director Jeremy Malcolm in Berlin today, "but at the same time we have to be extremely careful about placing too much responsibility in the hands of tech companies to eliminate online child sexual abuse using algorithms. These tools need to be carefully audited for their impacts on innocent people, and may not be the best approach towards preventing abuse."

In addition to promoting child sexual abuse prevention, the Foundation is currently conducting a campaign directed at Russia's top search engine, Yandex, asking the company to do a better job of hiding search results that directly link to images of child sexual abuse on the open web.

"That is an example of a technology that has been tried and tested by Internet companies for years, and which we know is working well," Malcolm said. "But when it comes to using AI bots to snoop on the private photos that people are uploading, or to make inferences from their behavior online, we need to tread a lot more carefully to make sure that these technologies aren't abused."

For further information, please contact:

Jeremy Malcolm (Executive Director)

+1 415 650 2557 – [email protected]

Prostasia’s website: https://prost.asia

###

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920