Prostasia Foundation
Protecting children by upholding the rights and freedoms of all
A crisis of transparency in the child protection sector

"At INHOPE, we are all about transparency," said Denton Howard, Executive Director of INHOPE, as he informed our Executive Director that he was to be ejected from the INHOPE Summit for live-tweeting about the meeting.

INHOPE is the international network of Internet hotlines that receive reports of illegal images of minors online. The hotlines triage these reports and take appropriate action: the images may be reported to police, added to an official or voluntary industry blocklist, or, if they are determined not to be illegal, no action may be taken at all.

We were attending the INHOPE Summit because we and other civil liberties groups have longstanding concerns that there are systemic shortcomings in the transparency and accountability of how this important public function is conducted—including the software used and how it functions, the criteria used for classifying images, the stakeholders who are consulted, and the processes for ensuring that over-blocking does not occur.

Because of these deficits, lawful content that should never be blocked may end up censored—but that's the least of our concerns. A bigger concern is that the censorship practices established in the domain of child protection are regularly used as a precedent for broader regimes of censorship of speech, including political protest, art, and the speech of minority groups.

Although the transparency and accountability gaps of INHOPE members and partners are widespread (and we will touch on some of them below), some good practices do exist. In our most recent podcast interview, we discussed with Internet Watch Foundation Chair Andrew Puddephatt OBE how the IWF has been setting high standards in this area. Puddephatt mentioned, for example, that some other INHOPE members accept hotline reports without any human verification that the reported images are actually illegal (you can watch the full interview below).

We attended the INHOPE Summit—or attempted to—to seek clarification on some of these issues. How many hotlines process images without human verification? How many hotlines accept reports of cartoon images depicting fictional minors that don't involve the abuse or exploitation of any real child? How can stakeholders participate in the development of the "baseline" factors used to classify images that are deemed illegal worldwide? How uniform are the national baseline standards applied by INHOPE members, and where are these standards published?

In the end we were unable to put any of these questions to the meeting, because a representative of NCMEC (the National Center for Missing & Exploited Children) complained about our Executive Director's live-tweeting of the event, even alleging that one tweet was defamatory. Unfortunately, this isn't an isolated incident: our repeated previous attempts to communicate with NCMEC about its policies and practices, dating back to September 2018 (the month after our launch), had also gone unanswered.

If a meeting is intended to be held under conditions of confidentiality, it's simply good practice to make that clear at the outset. Another option—the one that we adopted for our #SexContentDialogue last month—is to allow the meeting participants to quote from discussions at the meeting, but not to identify the speaker; this is called the Chatham House Rule. In Violet Blue's recent report for Engadget about our meeting, she expressed surprise that we had allowed press to attend our meeting at all, given that "usually such meetings… are kept behind closed doors."

No such stipulations were made about the INHOPE Summit, and given the public importance of the issues being discussed, we took it as our duty to report key points made at this meeting. So it came as a surprise when our representative was asked to leave the meeting over a tweet summarizing the presentation of a case in which a teenager was imprisoned for 31 years without parole for abusing his siblings. The tweet was deleted on request as a courtesy, although legal advice that we have since obtained indicates that there was no substance to INHOPE's allegation that it was defamatory of the speaker.

Other organizations in this sector have treated us in a similar manner. We have had no response to our repeated outreach to Thorn, an organization that Violet Blue recently investigated for its involvement in the surveillance of sex workers. Project Arachnid refused our request to evaluate the software it uses to automate the removal of material detected as illegal.

As a ProPublica investigation revealed earlier this year, child pornography prosecutions have been dismissed because the secretive Child Rescue Coalition refused to give up details of its Child Protection System software, and our requests for access haven't been any more successful.

Google and Microsoft also maintain software used to filter illegal images (Google's includes a system that relies on artificial intelligence to identify new images of abuse). Although these systems are made available to other companies, our requests for access to evaluate them for effectiveness and compliance with human rights norms have been refused, when they haven't simply been ignored.

To be clear, we are not singling out any one company or organization as being particularly intransigent here; the lack of transparency in this sector is a long-standing, systemic problem. The Internet Watch Foundation, which has been exceptionally open and helpful in its dealings with us, at least explained its reasoning for refusing us access to its URL blocklist and image hashes, "as both are heavily restricted due to the nature of the content contained within them."

But this reasoning doesn't stand up. Real images of child sexual abuse cannot be derived from image hashes. Nor are URL lists illegal in themselves; it is possible to scan such lists for mismatches without bringing up any illegal content, as the sketch below illustrates. And when such lists have been leaked or reverse-engineered in the past, they have been found to contain completely innocent content.
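By way of illustration, here is a minimal sketch in Python. It is our own illustration, not any hotline's actual tooling: real systems use perceptual hashes such as Microsoft's PhotoDNA rather than SHA-256, and the function names and domain list here are hypothetical. It shows the two points above: a hash is a one-way fingerprint that cannot be turned back into an image, and a URL list can be checked for obvious mismatches without retrieving any content.

```python
import hashlib
from urllib.parse import urlparse

def hash_image(path: str) -> str:
    """Return a hex digest of an image file's contents.

    A hash is a one-way fingerprint: given only the digest,
    there is no function that reconstructs the original bytes.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Illustrative only: an auditor can flag likely over-blocking from
# the list text alone, without fetching or viewing anything.
KNOWN_LAWFUL_DOMAINS = {"wikipedia.org", "archive.org"}

def flag_suspect_entries(blocklist):
    """Return blocklist entries whose domain is a known lawful site."""
    suspects = []
    for url in blocklist:
        domain = urlparse(url).netloc.lower()
        # Match the domain itself or any of its subdomains.
        if any(domain == d or domain.endswith("." + d)
               for d in KNOWN_LAWFUL_DOMAINS):
            suspects.append(url)
    return suspects

print(flag_suspect_entries(["https://en.wikipedia.org/wiki/Some_article"]))
# -> ['https://en.wikipedia.org/wiki/Some_article']
```

Nothing in either check requires the auditor to possess or view illegal material; only the fingerprints and the list entries themselves are examined.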

And therein lies the problem: without independent scrutiny of these lists (even under a non-disclosure agreement, which we have indicated we are happy to sign), watchdog groups like ours have no way to verify that the sector is doing what it claims to be doing. We are not interested in naming and shaming actors, but we are calling on the sector as a whole to do better.

Our requests aren't arbitrary. We have been up-front about our intentions: we are asking for information and access to these tools because we are compiling an inaugural whitepaper on the transparency and accountability practices of Internet platforms and agencies when it comes to child protection.

Additionally, we have a practical need for access to these resources: we moderate a web forum of our own, and we partner with an independently operated support forum called MAP Support Chat. The latter had previously been censored by the Internet company Discord, despite professionals attesting to the importance of such forums in helping to prevent abuse. Both are real-world environments for putting these systems and blocklists through their paces.

Our ejection from the INHOPE meeting today is the lowest point of what has been a poor showing of support from the sector for our evidence-based, human rights-focused, and sex-positive approach. It also marks a point of divergence between the hotlines and the actual experts who are the strongest supporters of this innovative and inclusive approach.

At this point, the gloves are off. With or without cooperation from the large, government-funded Internet hotlines, we will continue to do what we believe is right: shining a light into this largely unaccountable sector, and holding it to a higher standard of practice befitting the importance of the work that the public entrusts to it. If you believe in our work, you can click the button below to donate to our campaign to raise funds for the completion of our transparency whitepaper.

Donate now
#SexContentDialogue heads to RightsCon

On May 23, Internet companies, experts, and representatives from a broad range of impacted stakeholder groups came together in San Francisco for our Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, & Child Protection.

Reporting on the event for Engadget, journalist Violet Blue asked:

What have we learned today? Well, for one, the scorched-earth approach to sex censorship—FOSTA—is working about as well as the "war on drugs." For another, we should've known the war on sex was a lucrative growth market.

A full report of the San Francisco meeting is contained in the draft background paper for our follow-up meeting at RightsCon, which will be held on June 13 at 9am. You can read and comment on the draft paper on our new forum.

The paper also contains nine draft best practice recommendations about how Internet platforms should deal with the moderation of sexual content, in the face of complaints that such content sexualizes children. We'll be writing more about those principles after they are presented at RightsCon, but you can get a first glimpse of them now—and if you have thoughts you'd like to share, we'd love to hear them.

It's important for us to be at RightsCon so that we can present our singular vision of a rights-centered approach to child protection to an audience of 2,500+ business leaders, policy makers, general counsels, government representatives, technologists, and human rights defenders. But getting there is going to send us further into debt. So we're asking you, as a valued supporter: could you donate a few dollars to our campaign to cover our travel expenses? You can do that by clicking the button below.

Donate now
Latest blog posts
Ageplay is for adults
Hi, my name’s Meagan, but my Daddy calls me kitten! I love rainbows, and unicorns, and my kitty cats, and my stuffies, and arts and crafts, and reading stories and…
Read more...
Sex education makes people safer
I didn’t set out to teach sex education. I intended to be a high school English teacher, and that’s what I became. I moved overseas, taught English, and I loved…
Read more...
New Prostasia web forum

After a long wait, Prostasia Foundation is proud to announce the return of our web forum. In our September 2018 newsletter, we explained that we had deactivated our previous forum due to a privacy bug that could have enabled the identities of forum participants to be revealed (though we were able to confirm that this had not actually happened).

This problem has been addressed in our brand-new forum, which operates on an independent software platform, not linked to our website or to our membership database—although paid-up members do receive an invitation code that gives them access to an exclusive members-only area.

We are currently using our forum to consult on the outcomes of #SexContentDialogue, including nine best practice principles that we will be presenting for community feedback and further work at our RightsCon 2019 session. But there are also threads about our new mascot, our guidelines on terminology use, and the lobby groups responsible for censorship legislation.

Come register on our forum and join the conversation—we look forward to seeing you there! 

Visit the forum
Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920