Sex, censorship, and transparency
When you encounter an image of a naked minor, what feeling does it evoke? It depends, of course, on the context. If the child is your baby, and it's an image you took of them in the bath, it may evoke the feeling of parental love. If it's a press photo of a naked child on a battlefield, it evokes feelings of horror and anger. If it's a portrait in an art gallery, it might evoke aesthetic appreciation. If you're 18 and you have a 17-year-old partner, it might evoke your sexual interest. If you're the 17-year-old, it might evoke pride in your body, or shame.

But there are certain images whose distribution is unmistakably an act of child sexual abuse, because they depict a minor sexually and are being distributed at large, something to which a minor cannot consent. Prostasia Foundation supports existing laws that censor such material, including by making its distribution by adults a criminal offense. We also support the industry best practice of including such verified illegal images in hash databases, so that they can be voluntarily filtered out of online services, thereby reducing the ongoing damage suffered by the child.
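For readers unfamiliar with how hash databases work, here is a minimal sketch of the filtering step, with a hypothetical placeholder hash; real deployments such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, rather than the plain cryptographic hash shown here:

    import hashlib

    # Hypothetical database of hashes of verified illegal images, of the
    # kind distributed to platforms by bodies such as NCMEC or the IWF.
    # The entry below is a placeholder, not a real hash.
    KNOWN_ABUSE_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def should_block_upload(image_bytes: bytes) -> bool:
        """Return True if an uploaded image matches the shared hash database."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_ABUSE_HASHES

The appeal of this design is that platforms never need to host or view the images themselves; they only compare hashes against a list maintained by a trusted third party. That is also exactly why the transparency of that third party matters so much.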

However, because this is something of a "nuclear option" that we don't apply to any other sort of content and wouldn't accept in any other context, it's important that its boundaries are very clearly delineated, and that the participants in this process are scrupulously transparent and accountable for their roles in it. Otherwise, there is very real potential for automatic censorship of content to be extended to areas that don't involve protecting children from actual sexual abuse, in ways that harm innocent people's rights.

Today Prostasia Foundation joined other human rights organizations in expressing our concern about a new proposed EU Terrorism Regulation that would create new obligations for Internet platforms to censor content that they have identified as terrorist material, based on a database of hash signatures. The playbook for this regulation comes from the way that we handle child sexual abuse content, and it too began as a voluntary practice. But the signatories to today's joint letter express concern about the transparency of the database and the accountability of those who use it—concerns that will only be amplified when this voluntary practice ascends to the status of law.

Prostasia Foundation is the only child protection organization voicing similar concerns about the censorship of (what ought to be limited to) child sexual abuse material. Currently, we have to take it on trust that the institutions who maintain this censorship infrastructure are doing so in good faith. But trust is in short supply among the sexual minorities who have been censored on Tumblr, Craigslist, Medium, Discord, and countless other platforms for posting sexual discussions, art, and fiction. Those who are censored online ought to be able to know: what policy did we infringe? Who developed this policy? How can we argue for it to be changed?

Most of the images that are reported to the UK's Internet Watch Foundation (IWF) as child pornography (about 65%, by its own figures) are ultimately found not to be illegal. The IWF, by and large, does a magnificent job of differentiating the two, which is why we are so privileged to have its Chairman, Andrew Puddephatt, as one of the participants in our upcoming Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, & Child Protection. But given the manifest inaccuracy of sexual content censorship algorithms in the wild, "trust, but verify" is surely a fair standard.

To be clear, the problem isn't always that platforms are censoring too much. There are legal images hosted on platforms like Instagram that some consider ought to be prohibited by those platforms' terms of service because they are sexually exploitative of minors. There are also websites that you might think a reputable Internet company would refuse to host or link to, because of the way in which they depict children engaged in innocent activities such as modeling and nudism. These websites aren't included in voluntary industry blocklists; some argue that they should be, and we'll be talking about some of them at our May meeting.

Rather, the problem is the lack of consistency in platform policies, which in turn flows from their lack of transparency about those policies and the technologies used to enforce them. To guard against over-censorship and under-censorship alike, we simply want to know where the line between permitted and prohibited content is being drawn, and why. The law does not always draw that line in an obvious place (for example, the 17-year-old's selfies mentioned above may be illegal), but at least we have a degree of certainty about where that line is. In the case of censorship of images of sex and nudity by Internet platforms, we ought to have similar certainty, and unfortunately, we don't.

For example, for years Google allowed pro-contact blogger Amos Yee to build a large YouTube following advocating sex between adults and children, yet it has censored the Wikipedia article for "lolicon" from its search results (lolicon is Japanese cartoon art featuring underage girl characters). Microsoft displays a stern warning when you search for the phrase "child pornography" (as if anyone with ill intent would do that), yet when you search for another two-word phrase that is associated in the professional literature with child abuse material, the top search result leads directly to unlawful and abusive images.

This shocking and unwanted discovery came about through our "CP" campaign, a primary prevention campaign aimed at warning users away from accessing unlawful sexual images of minors on the dark web. As part of our background research for the campaign, we discovered some of the inconsistencies in platform policies that allow such images to be easily accessed even on the open web. The first was that the Yandex search engine was commonly being used to access child sexual abuse material, because it had failed to adopt industry best practices for downranking such material.

But to our alarm, we more recently found similar inconsistencies even on the major platforms: Facebook, Microsoft, DuckDuckGo, and others have left open obvious means of access to unlawful images, even as legitimate speech about sex is being censored and new technologies are being rolled out that could infringe the freedom of expression and privacy of innocent people. You can argue about where some of the lines of sexual censorship should be drawn. But it should be possible for everyone to agree that the process should at least be accountable and transparent.

The discovery of apparently illegal, highly-ranked content on major search engines, in response to obvious search terms, took us by surprise. Was this a mistake? A policy decision? A bug? To find out how to address cases like this, we would need access to the lists of search terms that platforms use in their algorithms, and to the industry-shared software used for filtering. The IWF maintains one such list used to uncover child sexual abuse content (which it previously described as "paedophilic content", until we pointed out the inaccuracy of that term). Thorn and the National Center for Missing and Exploited Children (NCMEC) also maintain lists of keywords, URLs, and image hashes. Microsoft, Google, and the Child Rescue Coalition (CRC) maintain software that uses these lists and hashes to quickly identify unlawful content.
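To make concrete what access to these resources would let us evaluate, here is a minimal sketch of how a search engine might consume such lists; every entry and name below is a hypothetical placeholder, since the real lists are not public:

    # Hypothetical shared lists of the kind maintained by the IWF, Thorn,
    # and NCMEC. The entries shown are placeholders, not real data.
    FLAGGED_SEARCH_TERMS = {"example flagged phrase"}
    BLOCKED_URLS = {"https://example.invalid/blocked-page"}

    def handle_search(query: str, results: list[str]) -> tuple[bool, list[str]]:
        """Return (show_deterrence_warning, filtered_results) for a query."""
        show_warning = query.strip().lower() in FLAGGED_SEARCH_TERMS
        filtered = [url for url in results if url not in BLOCKED_URLS]
        return show_warning, filtered

Whether a given platform shows a warning, downranks, filters, or does nothing at all for a given term is precisely the kind of policy choice that cannot currently be audited from the outside.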

But each and every one of these organizations that replied to our enquiries refused to provide us with access to these resources for evaluation of their effectiveness and their compliance with human rights norms. We will be noting their responses in an upcoming Internet platform report this year. But if you don't mind spoiler alerts, here's the bottom line: the public's inability to know how and why legitimate sexual speech is being separated from unlawful sexual images of minors constitutes a major deficit in the transparency and accountability of the Internet industry's largely self-regulatory approach to child sexual abuse prevention. It's hurting children, and it's hurting human rights at the same time. It has to change, and we are here to hold the industry to account.

New Prostasia Podcast/Vodcast
This month we launched a new monthly podcast/vodcast series titled Sex, Human Rights, and CSA Prevention, which is available on YouTube as a video playlist, and on all major podcasting platforms including Google and Apple. Each month we will talk to a different expert about their experiences with a different aspect of child sexual abuse prevention. For the first month, our very special guest was David Prescott from NAPN, the National Adolescent Perpetration Network.

Listen Now
If you enjoyed David's interview, you will be glad to know about a new benefit for Prostasia Foundation members: you are now entitled to discounted NAPN membership. If you are a member, contact us for the discount code that you can use on the NAPN website to get 50% off their annual subscription (valued at $99). The best time to do this is now, while early-bird pricing is still available for the annual NAPN Conference on 2-4 May.
Our fight against FOSTA
Prostasia Foundation is working with a coalition of other human rights and sexual rights advocates, educators, writers, and other professionals to fight back against FOSTA, the law responsible for much of the recent over-censorship of sexual speech. We are preparing an amicus curiae ("friend of the court") brief to support the plaintiffs in the Woodhull Freedom Foundation lawsuit to have FOSTA declared unconstitutional. The case was dismissed on procedural grounds, and the plaintiffs have appealed that dismissal to a federal appellate court. It will be important for our brief to demonstrate to that court how FOSTA has resulted in the censorship of child sexual abuse prevention information. We depend entirely on membership fees and donations to fund our operating expenses, so your support for our FOSTA brief is vital and appreciated.
Donate to our FOSTA campaign
Recent blog post
Is censoring online porn the best way to keep children safe?
If we want to make sure children are safe online, let's stop them watching porn. Does that sound like a sensible proposal? Or even a realistic one? While, on reflection,…
Read more...
Become a member
Please join Prostasia Foundation as a member to support our important work

Click Here
Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920