Prostasia Foundation Protecting children by upholding the rights and freedoms of all
Internet reporting hotlines are censoring art

This week United States Attorney General William Barr cited the need to address child exploitation as one of the factors motivating a mooted review of a law called CDA 230, which provides that Internet companies aren’t responsible for what their users say or do online. There are many dimensions to the problem of child exploitation, ranging from inappropriate comments on Instagram photos to child grooming on Fortnite... but the one that captures most public attention is the problem of illegal sexual images of minors being shared online.

CDA 230 has nothing to do with that problem, however. These images were never protected by CDA 230 to begin with—platforms were always obliged to take them down as soon as they were discovered. And platforms already have a robust way to remove them immediately and without human intervention—at least where the images have already been encountered and flagged as illegal before. This is done using hash-scanning software such as Microsoft’s PhotoDNA, in conjunction with lists of image hashes contributed by the platforms themselves, or collected by third-party Internet hotlines for reporting child abuse images. When a user attempts to upload content to a platform that utilizes these image hash lists as part of its moderation process, the content will be blocked and the user will be reported to law enforcement authorities. In the United States, this is mandated by law.
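The matching step described above can be sketched in a few lines. This is a simplified illustration, not PhotoDNA itself: PhotoDNA computes a proprietary *perceptual* hash that matches near-duplicate images, whereas the cryptographic hash used here only catches exact byte-for-byte copies. The blocklist digest and function names are hypothetical.

```python
import hashlib

# Hypothetical blocklist of hex digests of known-flagged images.
# Real hash lists are distributed to platforms by hotlines and peers.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_blocked(image_bytes: bytes) -> bool:
    """Return True if the upload's hash appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# A matching upload is rejected (and, in the US, reported) before
# any human moderator ever sees it.
```

Note the asymmetry this creates: adding a hash to the list is trivial, but nothing in the mechanism itself verifies that the underlying image was ever lawfully assessed—which is exactly the accountability gap discussed below.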

So far, so good. But the big problem with this arrangement is: who ensures that these hash lists contain only illegal content? Since the images are illegal to view, the platforms aren’t allowed to share them after reporting them. This means that there is a risk that lawful content, or content such as cartoons that aren’t images of real child abuse, will also end up on these lists, and that it will be impossible to get it off again.

In fact, we know that this happens regularly, and that the Internet hotlines are complicit in it. When NCMEC, the United States hotline, receives reports to its CyberTipline that relate to a user in a foreign location, it sends the complete report to the foreign police force, including the personal details (IP address) of the user, without any verification that the content in question is illegal. As a result, up to 90% of the images passed on by NCMEC are later assessed to be innocent. Robert Jones from Britain’s National Crime Agency has testified that the inclusion of these innocent images, including cartoons, is “not really what this regime is designed to detect.”

Canada’s Internet hotline Cybertip.ca, operated by the Canadian Centre for Child Protection, has also been accepting reports of cartoons and forwarding them to authorities. In May 2019, a 17-year-old Costa Rican girl was arrested for posting drawings to her blog. The arrest came in response to a report passed on by Canadian authorities. When Prostasia Foundation contacted the Centre about this, they refused to confirm or deny responsibility for the referral that led to the girl’s arrest. However, they did say:

While we do not share details on specific reports submitted to Cybertip.ca, it might be helpful for you to understand our process. Through Cybertip.ca, the public can report concerns about the online sexual exploitation of children. We forward any potential concerns to the appropriate law enforcement agency and/or child welfare. These authorities determine whether to proceed with an investigation.

Yet even flagging “potential concerns” is a determination of potential illegality which exposes the user to the risk of prosecution. It is natural that foreign police forces will take these “potential concerns” seriously, and it is disingenuous for the Canadian Centre to wash its hands of responsibility. Unlike the case of real images of child abuse, which directly harm children, the harm caused by drawings of child abuse is entirely subjective and much more difficult for a non-judicial agency to determine.

Since April 2010, the United Kingdom’s Internet hotline, the Internet Watch Foundation (IWF), has been accepting reports of “non-photographic child sexual abuse images” such as cartoons, provided that these are hosted in the United Kingdom. Unlike the American and Canadian hotlines, it does not forward these reports to foreign authorities. The IWF has also confirmed to us that it does not currently add them to the image hash lists that it offers to its member Internet platforms.

In November 2019, the UK-based host of an art blog was arrested for hosting child pornography because the blog included two comic strip panels in a long and academic discussion about the line between legitimate art and child pornography. The panels in question were from an Ignatz Award-nominated semi-autobiographical comic, Daddy’s Girl by Debbie Drechsler, and they depict her own experiences of incestuous child sexual abuse. We have been unable to confirm whether the IWF was involved in the arrest of the man, who writes:

Do I have the right to tell Debbie that her work is now child pornography? It probably took her 20 years to drum up the courage to put her feelings down on paper and now 20 years on with the book still freely available some IT technician has decided that he does not like a picture from that book.

Cartoons that depict minors sexually can be offensive, and in some countries such as Costa Rica and the United Kingdom they are also illegal. Yet Prostasia Foundation’s position is that Internet hotlines ought not to be engaged in the business of censoring such art or recommending cases for the police to prosecute. As the censorship of Debbie Drechsler’s comic illustrates, there is no clear line between art that graphically depicts child sexual abuse and “obscene” pornography.

If such a line is to be drawn, it should be drawn by courts, not by private, quasi-governmental organizations. These organizations’ special censorship powers enable content to be instantly eliminated from online platforms, and referred to the police, with little accountability or transparency. This is a “nuclear option” that we only tolerate because photos or videos that depict real children being abused are uniquely abhorrent and directly harmful, and must be eliminated quickly to minimize that harm. They are also relatively easily identified—either they depict a real child being abused, or they don’t. These same considerations do not apply to artworks, no matter how offensive they may be.

That doesn’t mean that there is nothing that can be done about art depicting minors that is potentially offensive or triggering. Prostasia Foundation’s No Children Harmed certification standard sets out some guidelines about how such artworks should be handled. Like all adult-oriented content, they should be hidden from minors by default (and from others who choose not to see them). But additionally, the standard requires members to “specifically label, flag, or mark any lawful content that depicts imaginary minors engaged in sexual activity or that depicts nudity of real or imaginary minors.”

Prostasia Foundation is calling on INHOPE, the international organization of Internet reporting hotlines, to instruct its members that they should not be treating artistic images as equivalent to photographs and videos of real child sexual abuse. Specifically, such content should not be added to hash lists to be automatically censored, and should not trigger reporting to law enforcement authorities unless required by the law where the hotline operates.

Let’s ensure that the vital work of eliminating real child abuse images from the Internet isn’t diluted into a moral war against art and artists. It’s too important for that, and the costs of getting it wrong are too great. You can help to send this message to INHOPE and its members by signing our petition.

Sign the petition
Latest blog posts
We have to protect children — even the ones you hate
Over the last four decades or so—the course of my life—we have seen an ever-growing awareness of the effects of childhood abuse and, in particular, the effects of sexual abuse.…
Read more...
Age gap relationships through the looking glass
When I was in my teens, I found myself attracted to people who were 25 or, preferably, older. As I aged, the range extended up a little and, since I…
Read more...
Launch of #SexContentDialogue principles

In May 2019, Internet companies gathered with policy experts and representatives of marginalized groups at #SexContentDialogue in San Francisco to talk about how to minimize the harm done to minorities in our fight to prevent child abuse online.

This led into the development of a set of principles that could be used to guide Internet platforms of all sizes to adopt a more nuanced and better-informed approach towards the moderation and censorship of sexual content, with a view towards protecting children from sexual abuse while also upholding their rights and the rights of others.

In June 2019, we took this conversation to the world at RightsCon in Tunisia, and we have been finalizing the principles since then. The result is a document that emphasizes that the most important priority in child protection for content moderation professionals is to minimize direct harm to minors—and that this doesn’t need to conflict with freedom of expression.

On November 18 at the Internet Governance Forum in Berlin, Prostasia Foundation launched version 1.0 of its #SexContentDialogue principles at an event chaired by Kelly Kim from Open Net Korea.

The new principles suggest that Internet companies should prioritize the removal of content that directly harms children, such as child abuse images, but that they should avoid blanket restrictions on non-abusive sexual content, and should consult with minority groups who might be affected by such restrictions.

The principles also caution against reliance on artificial intelligence algorithms to decide whether sexual content is permissible, and encourage companies to spell out their rules on sexual content in detail.

These principles are now at version 1.0, but we're not done yet. We've already used them as the basis for developing our No Children Harmed certification, and on our longer-term roadmap we plan to work with Internet content moderators to develop a set of template community standards for Internet platforms that embody these principles in a more operationally specific way.

Read the principles
Your donations help those who need help
Prostasia Foundation has a specific fund dedicated towards helping individuals get help from professional therapists, when they feel that they could use additional support in managing inappropriate sexual feelings. On a discretionary basis, the fund will cover the cost of one initial consultation with a therapist selected by the client.

Donate now
Review: Hunting Warhead

Hunting Warhead is a six-episode podcast series from CBC/Radio-Canada and the Norwegian newspaper VG, which covers the true story of the dark web child abuse website Child's Play. In the course of the series the site's administrator, Ben Faulkner, speaks from his prison cell. We also hear from a broader cast of characters who were involved in taking it (and him) down—hackers, police, journalists, psychologists, Ben's friends and family, and his victims.

For the most part the series leaves the listener to decide their feelings towards how the case was handled, and even offers a balanced (yet ultimately damning) treatment of Faulkner himself, describing his realization that he was a pedophile, his limited attempts to obtain help from an unsympathetic therapist, and his ultimate descent into criminal activity.

The value of this series is that it goes beyond condemning Faulkner for his evil crimes, and asks how those crimes could have been prevented. In the final episode, host Daemon Fairless narrates the story of Faulkner's final sentencing hearing and its aftermath. In the course of the series he had spent hours talking to Faulkner by phone, but eventually he decides that he can no longer take calls from the unrepentant criminal, saying:

It isn't simply because Faulkner is a pedophile. In the course of this series I have spoken with pedophiles who understand that acting on their desires is wrong. Who live in a state of perpetual self-loathing. Who, I have every reason to believe, will probably never abuse a child. As tempting as it can be to think of pedophiles as monsters, it's not accurate. They're people with an affliction we don't yet fully understand. Nor do we really know how to help them. More than that, maybe we don't really care to help them. So I wonder whether we maybe have a bit of reckoning to do, if we truly do want to protect children. That said, it's also true that there are people who do monstrous things. People who exploit and harm others simply to satisfy their own selfish desires. Who don't care what kind of fall-out they leave in their path. I think that's an accurate description of Benjamin Faulkner. Not "pedophile," not "master of the dark web," but a selfish, remorseless, sociopath.

Another remarkable aspect of this story is that for the majority of the time that the Child's Play abuse website existed, it wasn't being run by Ben Faulkner—for a full eleven months, it was being run by the Australian police. Not only did they allow the website to remain open for its thousands of anonymous users to upload and exchange images of child sexual abuse, but incredibly, the police themselves uploaded child abuse images under Faulkner's name, to perpetuate the illusion that the site was still under his control. Amnesty International and UNICEF regard this as a serious human rights violation by the police against the children depicted in the abuse images—but their actions were supported by mainstream child protection groups such as ECPAT.

Hunting Warhead is difficult listening at times, but it is worth diving in and listening through to the end. It leaves the listener convinced that there must have been a better way of preventing Ben from following the path that he did. As such, although it is a harrowing story, it is ultimately also a hopeful one.

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920