Prostasia Foundation
Protecting children by upholding the rights and freedoms of all
Encryption and CSA prevention

Just over one week ago, the New York Times published a major investigation into the intractable problem of illegal sexual images of minors being exchanged online. Despite flaws in the story and its companion pieces, its main takeaway, that Internet companies have failed to adequately address the problem, has resonated widely.

We too have been critical of some of the Internet platforms called out in the article. But at the same time, we need to be realistic about how much responsibility we can (or should) place on tech firms to solve this problem. In a previous newsletter, we wrote that “the large platforms have already done most of what can be done to prevent the sharing of known illegal images, by ensuring that images are scanned against a database of illegal content before they can be uploaded or shared.”
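For readers curious about the mechanics, here is a minimal sketch, in Python, of the kind of pre-upload check described above. Every name in it is hypothetical, and the exact SHA-256 digest is a simplifying assumption: deployed systems such as Microsoft’s PhotoDNA use perceptual hashes instead, so that resized or re-encoded copies of a known image still match, and the hash lists are supplied by clearinghouses rather than hardcoded.

```python
import hashlib

# Hypothetical list of hashes of known illegal images. In deployed
# systems these lists come from clearinghouses such as NCMEC, and the
# hashes are perceptual (e.g., PhotoDNA) rather than cryptographic, so
# that near-duplicate copies of a known image also match.
KNOWN_IMAGE_HASHES = set()

def matches_known_image(image_bytes):
    """Return True if this exact image appears in the known-content list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

def handle_upload(image_bytes):
    # The scan runs before the image is stored or shared. This is also
    # why such scanning stops working under end-to-end encryption: the
    # server never sees the plaintext image bytes to hash.
    if matches_known_image(image_bytes):
        return "rejected: matched known-content database"
    return "accepted"
```

The design point to notice is that the check runs on the platform’s servers against plaintext image data, which is exactly what end-to-end encryption removes.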

Does this mean that they couldn’t do more to prevent online child sexual abuse? Certainly, they could. They could implant spying tech into your web browser, so that it checks every image that you load and every website that you visit. The microphones in your home devices could become always-on bugs that listen for sounds from child exploitation videos. A back door could be added to encrypted messaging apps, opening up your communications to government surveillance.

The New York Times investigation has pushed the last of these ideas back into the political spotlight. Last Thursday, law enforcement and national security officials from the United States, the United Kingdom, and Australia wrote a letter to Facebook warning it to hold off on its plans to add strong encryption to its Facebook Messenger app. The following day, Deputy Attorney General Jeffrey Rosen reiterated those demands at a summit at FBI headquarters.

The relevant question here isn’t whether tech companies could do more to intercept child sexual abusers; of course they could. The question is whether they should. If there is any use case that could justify such intrusive surveillance, the fight against child sexual abuse is it. But there are limits to what we allow governments and private companies to do, even in the pursuit of an important objective such as investigating crime. Human rights law sets those limits, and the right to communicate privately is one of them.

That’s why dozens of civil society groups, including Prostasia Foundation (the only child protection organization among them), pushed back against the governments’ demands to Facebook in an open letter that we released last week, stating, “default end-to-end security will provide a substantial boon to worldwide communications freedom, to public safety, and to democratic values, and we urge you to proceed with your plans to encrypt messaging through Facebook products and services.”

It’s easy to see why this isn’t an entirely satisfactory answer for some, because it seems to suggest that we should just give up in the face of the horrible crime of online child sexual abuse. But that’s not true at all; it simply means that we need to find other, better methods of addressing the problem. For example, rather than attempting to outlaw strong encryption, perhaps we could actually leverage encryption to promote child protection, as part of a broader primary prevention approach.

That’s the basis for Prostasia Foundation’s concept for a project that would utilize the strong encryption and anonymity that underpins the Tor network to provide information and support resources to those who are at risk of offending. As we describe in our concept note, this project “will demonstrate our rejection of the narrative that the strong encryption technologies that enable privacy and anonymity online are incompatible with child protection.”

The difference between an approach prioritizing the detection and prosecution of offenders and our prevention-focused approach is the difference between viewing child sexual abuse primarily as a crime and viewing it primarily as a public health issue. Many of the circumstances in which minors suffer sexual harm don’t fit well within a criminal law frame: for example, a majority of new illegal images of minors are selfies, and about a third of perpetrators are minors themselves. That’s why framing child sexual abuse as a preventable public health problem enjoys increasing support among experts. Confoundingly, however, the funding dedicated to prevention initiatives is a tiny fraction of the amount dedicated to the carceral approach. This has to change.

If we really want to prevent people from accessing images and videos of child sexual abuse, we need to get over the idea that controlling the channels by which those images are exchanged is a viable solution to the problem. We aren’t going to be able to stuff the encryption genie back in its bottle. Hanging our hopes on that, and forcing tech platforms to cripple their products, takes the heat off our own responsibility to be part of a broader culture of the primary prevention of abuse.

Support this work
Latest blog posts
What purity policing fans get wrong about censorship
Alexus is a 15-year-old who writes fan fiction for fun. Her favorite fandoms include Supernatural, Legend of Zelda, and The Chilling Adventures of Sabrina. She is currently working…
Read more...
Why the UN is wrong to equate drawings to sexual abuse
Sex or nude scenes featuring teenagers in anime and manga could disappear from the Internet and be driven into underground “dark web” sites, thanks to a decision equating art with child pornography.…
Read more...
Banned Books Week 2019
Here we are again. It’s that time of year when we build our TBR piles against the better judgement of those who wish to censor literature and control thoughts. Yes,…
Read more...
Prostasia Foundation in Korea

Privacy is not the only human right that should guide the laws and policies that we craft to help combat child sexual abuse. Freedom of expression must also be taken into account in our response to this problem. This was the topic of last week’s event, “The Current Status of Freedom of Expression in the Republic of Korea,” attended by Prostasia Foundation Executive Director Jeremy Malcolm, and chaired by our Advisory Council member Kyung Sin Park.

The final session at this event, and the most pertinent one for Prostasia, was titled “Between imagination and actions, and the role of the government—the issue of virtual child pornography and real dolls,” and it dealt with the trend towards the criminalization of art, fiction, and sex toys that are perceived as representing minors in a sexual context.

As explained in our blog post above, last month the United Nations Committee on the Rights of the Child made a misguided recommendation that countries criminalize such fictional representations as if they were real images of children being sexually abused. As the first panelist, Kelly Kim, explained, such a law has already been enacted in Korea. It resulted in the number of indictments for child pornography offences increasing from 100 in 2011 to 2,224 in 2012.

Jeremy Malcolm spoke about how the United Nations recommendation had been driven by claims from child protection organization ECPAT that virtual representations are linked to the abuse of actual minors. He spoke about Prostasia Foundation’s attempt to raise money for research to test this claim, and also launched our No Children Harmed campaign, as described below.

The third panelist, economist Masayuki Hatta, explored how treating virtual representations as child pornography results in the misapplication of enforcement resources towards offences by people who will never abuse a child, as Swedish law enforcement officials testified when a manga collector was prosecuted in 2012. The fourth panelist, Ogino Kotaro, gave further examples of how law enforcement resources have been wasted against anime and manga fans, and suggested that experts should be consulted about proposed laws to criminalize these art forms.

Lawyer Hong-Seok Yang, who spoke next, addressed the difficulty of any legal test that turns on what constitutes a “representation” of a child. The Korean courts have ruled that it is unconstitutional to ban images of adults dressed in school clothes, and Yang argued that it makes no more sense to ban drawings that, objectively, have no real age at all. He was followed by writer and activist Sun-Ok Lee, who spoke out against the government regulation of sex dolls, arguing that human beings are capable of discerning the difference between an object that they use for masturbation, and the way they treat another human being.

The last speaker was Daniel Møgster from the United Nations Office of the High Commissioner for Human Rights—a separate body from the Committee on the Rights of the Child (CRC). He explained that a country seeking to follow the CRC’s recommendation to criminalize cartoons would inevitably be placing a restriction on freedom of expression, which would require it to pass a stringent three-part test: “it must be in pursuance of a legitimate aim, it must conform with legality, and it must be necessary and proportionate.”

Beginning with the “legitimate aim” limb of the test: preventing direct harm to minors is the objective of laws that prohibit real images of children, and it is well accepted that countries are authorized, and indeed obliged, to enact such laws. But for virtual images, which cause no direct harm to a child, the calculus is more difficult.

Møgster identified that the CRC Guidelines proposed two separate objectives as a legitimate aim for its recommendation that virtual images should be banned. The first is to protect children from the risk of indirect harm—but he referred back to previous presentations, including Prostasia Foundation’s, which suggested that this risk was speculative and unsupported by evidence.

The second proposed objective, he said, was to uphold the “dignity” of the child. He drew an analogy to the objective of the Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW). This Convention (which the United States signed, but never ratified) was intended, over time, to modify the social and cultural patterns of conduct of men and women, with a view to the elimination of harmful prejudice and discrimination against women. As he explained:

Now this is a positive obligation on the part of the state not necessarily because there is one victim, but because there is this goal of gradually recognizing the inherent dignity that women have in society as well. On my initial reading of the Guidelines, it seems that the other of the purposes, the dignity purpose, suggests that the depiction of virtual children… as objects of sexual desire, in and of itself, is something that might interfere with their human dignity.

Assuming that this does amount to a legitimate aim, there are still two more limbs of the three-part test that would have to be satisfied to justify the CRC’s recommendation. As Møgster went on to explain:

Even if the state might have a legitimate objective for restricting this type of material, you still have the principle of legality, and as previously mentioned, there is a requirement that laws be clear, especially criminal laws. … The judiciary has struggled and the legislature has struggled quite a lot in order to try and define… what kinds of content would be covered and what kinds of content wouldn’t.

As to the final requirement of the three-part test, that a restriction on freedom of expression should be necessary and proportionate, Møgster explained that this meant that where there are several different ways for a measure to achieve its protective function, “the least restrictive measure among the alternative restrictions” must be adopted. For example, a ban on sex dolls could be accomplished either by customs laws, or by criminal law:

This also shows that the state has a variety of means… to react to different types of content. We’ve also heard that criminal sanctions are the strongest sanction that the state has. It is crucial for any imposition of criminal sanctions that it doesn’t overreach, and the state has the burden of proof to justify that any measure is in fact covering no more than what it needs to cover.

The upshot of this is that the legality of the recommendations of the United Nations Committee on the Rights of the Child has been cast into serious doubt. Even on the most charitable view, the outright criminalization of representations as diverse as written storylines, drawings, and dolls will be very difficult, if not impossible, for countries to justify under international human rights law.

The fact that the Committee failed to acknowledge, let alone address, these manifest legal difficulties means that its Guidelines cannot be taken as a reliable guide for countries considering their approaches to child pornography law. In other words, the Guidelines have failed at the very purpose that they were intended to fulfill.

“No Children Harmed” launched

Prostasia Foundation believes there is a better approach to restricting the public dissemination of virtual sexual representations of minors—and that it doesn’t require government intervention at all. Rather than being based around censorship, this alternative approach is based on the promotion of voluntary best practices to ensure that nobody is exposed to such content without their consent, and that there are “no children harmed” in its creation or distribution.

That’s what our new No Children Harmed program is designed to certify. This new policy was soft-launched last week during our event in Korea, with five initial launch partners and several more on the way (you can read about one of them below!). Websites, publishers, service providers, vendors, and event organizers can use the No Children Harmed certification seal to certify that they are allies in the fight against child sexual abuse, and that they have taken reasonable steps to avoid harming children.

Zine review: PROBLEMATIQUE

One of the launch members of No Children Harmed is a new adults-only digital multi-fandom zine for charity titled PROBLEMATIQUE. Although its first issue remains under development, a sample issue is already available, and it sets out clearly what its team is attempting to achieve with this project.

The zine provides an excellent example of the approach that No Children Harmed intends to promote, in that although it contains challenging and, well, “problematic” content, the editors take care to ensure that readers are not exposed to this content without their consent.

This is done in several ways. First, both written and visual material is comprehensively and accurately labeled with content warnings, both in place and in a separate index at the back of the zine. Second, the zine is divided into three sections: a SFW (safe for work) section, a NSFW (not safe for work) section, and a graphic content section. Third, a version of the zine with censored images is also expected to be made available, although such a version wasn’t prepared for the sample issue.

For example, the sample issue’s NSFW section contains a Supernatural and Teen Wolf crossover story, which is tagged for those fandoms, given a rating of “NSFW,” and contains specific content warnings such as “underage” and “dubious consent.” A more explicit Gravity Falls story is contained in the Graphic Content section and tagged as such, with content warnings such as “underage” and “incest.”

Certainly, such confronting content isn’t for everyone. But that’s the point of PROBLEMATIQUE, whose every issue will focus on a specific subgenre of controversial or taboo art and fiction. “Our only restriction on content,” write the editors, “is that it cannot contain depictions or descriptions of identifiable real-world people.” The zine’s designer Specs writes, “My main goal is to foster a safe, kink-positive community in fandom. … For art and stories, tags and content warnings work wonders in practicing informed consent. No (real) humans harmed!”

It’s unfair to review the sample issue as representative of the final product, because the media used weren’t commissioned especially for this zine. However, based on the sample issue we can expect a joyously colorful and cacophonous design aesthetic, a diverse catalogue of fandoms, and content running the gamut from sweet and funny through to defiantly crude. But we can also expect such content to be accurately labeled, and ethically produced and disseminated. Prostasia Foundation is proud to welcome PROBLEMATIQUE as one of the launch members in our “No Children Harmed” program.

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920