Prostasia Foundation
Protecting children by upholding the rights and freedoms of all
How the war against child abuse material was lost

The battle to purge child abuse images from the Internet has been lost. That doesn't mean that we can't or shouldn't continue to work towards the elimination of image-based abuse. But it is widely acknowledged by law enforcement, reporting hotlines, and prevention groups alike that this can't be achieved merely by censoring images from the Internet and by criminalizing those who access or share them—which are the only strategies that society has focused on until now.

Completely censoring abuse images from the Internet has proved intractable because it would require the surveillance of all communication channels, including those that are end-to-end encrypted. It simply isn't possible for communications to be both securely encrypted and mass-scanned for child abuse images, and even the proposed Lawful Access to Encrypted Data Act wouldn't require such a gaping backdoor to be installed in secure communication apps and services. Even if such a mandate were put in place, free, open source encryption software is now ubiquitous. Secure communications are here to stay.

As for criminalization, at some point higher penalties for image-based abuse no longer have any further deterrent effect—and we reached that point long ago. Under existing state and federal laws in the United States, those convicted of possessing abuse images can easily receive a longer sentence than those convicted of the hands-on sexual abuse of a child. Yet as penalties for possession offenses have skyrocketed, rates of offending have increased along with them. Criminalization also has lifelong harmful effects on families and communities. Up to 15% of offenders are children themselves—and in some cases the victim and the perpetrator are one and the same.

Governments, nonprofits, and tech companies have failed

Responsibility for the harm that children suffer through the creation and circulation of abuse images lies solely with those who create and circulate them. Those harms are real, and we can't simply ignore them. But responsibility for our failure to contain this crisis lies with the institutions that have been entrusted with that task. Since governments, large child safety nonprofits, and technology companies have all doubled down on the two-pronged approach of censorship and criminalization, they all share the blame for its failure.

For governments, the emotional topic of child sexual abuse is routinely invoked to justify repressive laws and policies that could never otherwise secure passage. An example is FOSTA/SESTA, a law originally promoted as a solution to child sex trafficking, but which in fact targeted adult sex workers for criminalization and censorship. Aside from the harm that it did to sex workers—which can hardly be regarded as an unintended consequence—the law has also made the investigation and prosecution of real child sex trafficking cases more difficult than before, and resulted in censorship of content about abuse prevention.

A censorship-first approach has also been promoted by the large child safety nonprofits. NSPCC, the government-chartered child safety group from the United Kingdom, has been the driving force behind a campaign to hold Internet companies responsible for child sexual abuse. Its American counterpart NCMEC, which is also government-linked, was a key supporter of FOSTA/SESTA. It now also supports the EARN IT Act, a bill that would further expand the censorship of sexual content, but which is opposed by child sexual abuse prevention groups.

Technology companies have long borne the brunt of demands from governments and their allied child safety groups to adopt their censorship agenda. Over the past decade, they have increasingly capitulated by co-opting and partnering with these pro-censorship groups. In 2017, Facebook was the first tech company to join NCMEC in supporting FOSTA/SESTA. Most of the tech company representatives at this month's summit of INHOPE, the association of abuse reporting hotlines, are also alumni of government-linked groups: Twitter's from NCMEC, and Google's from the NSPCC. In short, large tech companies have not offered an effective check on the government's agenda, but have swallowed it whole.

A lesson from the music industry

Napster, the original peer-to-peer music sharing app that was released in 1999, created a massive headache for the music industry when its revolutionary model of music distribution led to an explosion in copyright infringement. The industry's initial response to this was exactly the same as the response that society has taken to the problem of child abuse imagery—censorship and criminalization. Indeed, many of the same underlying technologies are used to censor child abuse images as those that were developed to control digital content piracy.

But the industry soon learned that these approaches didn't work, and that in some ways they made the problem worse. Consumers resented being treated as criminals, adopted music piracy as part of their generation's counter-cultural identity, and simply moved to another file sharing app whenever one was shut down. In the movie The Social Network, when Napster founder and Facebook investor Sean Parker claims that Napster “brought down the record company,” another character objects, “Sorry, you didn't bring down the record companies. They won.” Parker responds, “In court.”

Eventually, the music industry came around to the idea that they needed to compete with Napster on its own terms, by providing a better, equally convenient alternative. When they finally did so by licensing affordable music streaming and downloads, the piracy problem largely went away by itself.

What we should be doing instead

The government-linked child safety sector and its tech allies have yet to reach the same realization as the music industry. And so they persist in the idea that ever-tougher criminal penalties, combined with the increased surveillance that would be required to enforce them in practice, will eventually be sufficient to eliminate abuse. But after more than 20 years of this experiment, it's finally time to call it a failure. If we continue down this path, things aren't going to get better; they'll continue to get worse.

To actually make progress towards solving the problem of child abuse online, we need to do what the music industry eventually did: we need to build a better pathway for people who are drawn towards it. Erecting border walls and surveillance posts around the Internet sends the wrong message to these people, and will only encourage them to circumvent these measures. Rather than trying to ensure that abuse images can't be accessed or shared, instead we need to focus on ensuring that there are better alternatives, so that fewer people feel the need to seek those images out.

Convincing people that viewing and sharing such images is harmful and wrong is a necessary and important part of achieving this outcome. But as with the war on drugs, “Just say no” goes only so far; the allure of the taboo is powerful. And as police are now realizing, this allure can extend even to offenders who aren't otherwise sexually attracted to children (in other words, those who don't fit the psychological profile of pedophiles). In short, there are a lot more people willing to perpetrate image-based abuse than even experts previously believed.

What does a better alternative look like for these people? In the broadest sense, anything that could prevent them from abusing a real child should be considered a viable alternative. In some cases this just means education, so that they realize their behavior comes at a cost to children: viewing abuse images is not a victimless crime, and many people who offend still don't understand that. For some, that understanding alone is enough incentive to stop. Others may require peer or professional support to make that connection and to adjust their behavior accordingly.

For others still, it may also help to be able to explore their taboo thoughts and feelings through victimless outlets such as art, fiction, role play, or sex toys. Using such outlets is not in itself an indication of a problematic sexual interest in any specific individual, but among the broader population of users, some may be relying on these materials as a coping mechanism. Prostasia Foundation is the only child protection organization raising funds for research on whether such outlets could be a tool in diverting these people away from offending against real children, as initial research suggests may be the case.

Active opposition to alternatives that could prevent offending

Far from promoting or supporting research into such alternatives, the government-linked child safety groups actually want them banned. Historically, many of these groups were associated with the baseless Satanic abuse panic that was a precursor to QAnon, and still today they remain intolerant of sexual minorities and of sexual expressions that are commonly (and wrongly) stigmatized. Their stated justification for criminalizing such expressions is that they are linked with real child sexual abuse, yet there is no evidence to support this claim.

The NSPCC, for example, rails against 18+ pornography and sex dolls that are too “young looking.” NCMEC allows those reporting child abuse images to include anime, drawing, cartoon, virtual, or hentai images, and includes these and other lawful images in its reports to foreign police forces. The Canadian Centre for Child Protection, which does the same, once reported a 17-year-old Costa Rican girl over cartoon images that she posted online, resulting in her arrest by authorities. The reporting hotline association INHOPE has refused to put an end to these practices.

Due to their close partnership with these groups, governments and tech companies have fallen into line behind their stigma-driven policies. During 2019, a raft of laws banning sex dolls was passed in the United States and overseas, despite ongoing research into their therapeutic applications. Under pressure from a stigmatizing press report initiated by a conservative activist who represents the NSPCC and other British groups, Facebook cracked down on the adult DD/lg lifestyle community. Other tech companies have been making similar censorship moves; Reddit, for example, banned almost twice as much content for “minor sexualization” in 2019 as in 2018, after expanding its policies and enforcement practices to include fictional content such as 18+ ageplay and manga art.

To be clear: through their blinkered preoccupation with criminalization and censorship, governments and big tech companies are not only failing to address the misuse of real child abuse images; by extending that censorship to lawful and possibly therapeutic outlets for some people who might otherwise be drawn to illegal content, they could actually be making the problem worse. Additionally, by establishing a precedent that content should be banned because it is immoral, rather than because it is harmful, they have played into the hands of those whose agenda includes banning other “immoral” content such as 18+ pornography.

Independent platforms are leading the way

It may be too late to disentangle large tech companies from the puritan agenda of the government-linked censorship cartel. At least for now, that agenda is being fought elsewhere, such as in the courts, where FOSTA/SESTA remains under constitutional challenge. But in the meantime, our hope for a more evidence-based approach to the prevention of online child abuse lies with smaller platforms.

For example, Fanexus, a soon-to-be-launched social media platform for fandom communities and creators, and Assembly Four, a sex worker and technologist collective that operates platforms for sex workers and their clients, are both dedicated to providing censorship-free spaces for their respective communities online. But at the same time, they are also working proactively to ensure that these platforms are not misused to perpetrate the abuse or exploitation of minors.

Because so much attention has been devoted to censorship and criminalization as the solution to child sexual abuse, we are still navigating the contours of a more prevention-focused approach. Legally and technically, what can be done to limit the availability of unlawful images of minors online, and what can't? How can platforms moderate content without resorting to a checkbox approach that embeds harmful stereotypes and assumptions? What does safeguarding look like for platforms that allow fictional content that references child abuse? What content warnings are sufficient for such material when it may be triggering for survivors?

These are questions we must now take seriously. Answering them will require an investment in research, and a willingness to engage with stigmatized topics and communities rather than sweeping them off our platforms and into darker corners of the Internet. The importance of this investment has been overlooked for so long because many people falsely believe that prevention isn't possible. But it is.

We can effectively work towards the elimination of image-based abuse, but not through mass surveillance and censorship or by further enabling the expansion of the carceral state. Instead, we'll solve it step by step as more individuals who now resort to the use of unlawful sexual images of minors decide for themselves that a better alternative exists for them… and as generations to come follow in their footsteps.

We look forward to the day when we can call the battle against image-based abuse a success, and we invite you to join us in fighting it.

Please donate
Recent blog posts
Dress codes: sex and stigma
I was in middle school when the spaghetti strap phase of the late ’90s hit (yes, I’m old). Basically, all tank tops that came out for a year or two,…
Read more...
A social worker’s role in child sexual abuse cases
When we talk about child sexual abuse in the United States, the focus is too often exclusively on laws and punishments, as opposed to social work and healing. To this…
Read more...
Sex education at all ages

Meagan Ingerman talks with August McLaughlin, the author of Girl Boner and host of Girl Boner Radio, about her work in sex education and why it's never too early to answer children's questions about sex.

Review: Girl Boner by August McLaughlin
Reviewed by Meagan Ingerman

Reading Girl Boner: The Good Girl's Guide to Sexual Empowerment by August McLaughlin is like having a delightful chat with a friend who also happens to be super knowledgeable about sex. While handling sensitive subjects with care, McLaughlin is also unapologetic in her candor about sex and relationships. Girl Boner is a great place to start, a good refresher, and a compelling story.

Because of a lack of comprehensive sex education, sex and attraction can be a real bugaboo for a lot of people. Girl Boner is the answer if you are looking to dip a toe into a friendly pool. The author even notes that the book can be read out of order for the sake of reading the chapters you need the most. 

The book is an easy read in that it moves along at a reasonable pace and the storytelling feels authentic. So does the earnestness behind the words. However, it’s worth noting that the book has some hard-to-read parts. There are challenging themes like sexual assault, abuse, and mental illness.

Content Warning: Eating Disorder

Aside from teaching about sex, Girl Boner explores McLaughlin’s past and what led her to her current position as author, podcast host, and sex educator. One of the harder-to-read parts of the book is where she details her experience of having an eating disorder, the destruction it can cause in one’s life, and how hard but rewarding healing can be.

If you suffer from an eating disorder or have in the past, you may want to skip this section. If you can read it safely, and I say this as someone working on healing from the same thing, this section is worth reading. It is relatable in a way most writings will never be for those of us who have struggled with eating disorders and it ties into other important themes like sex education and sexual empowerment.

End Content Warning 

Girl Boner makes an effort to be as inclusive as possible while still talking about sex and what gets us off. As a sex educator, McLaughlin has some awesome advice, exercises, and perspective on sex, liberation, and pleasure. She covers a wide range of sex- and masturbation-related topics while making them approachable for any level of sexual experience or lack thereof. The book includes journaling prompts for those who want to write about their own journey.

The one area where McLaughlin and I differ is porn and porn addiction. Even so, her arguments and commentary are generally fair and balanced even where I don’t entirely agree. I appreciate that the book does not present the subject as a one-size-fits-all situation and makes room for differing opinions. Mostly it feels like the author wants you to be safe with yourself, and I can totally get behind that.

All in all, I highly recommend this book to a wide range of people seeking information and liberation regarding sex and pleasure. Wherever you are in your own journey, this book will meet you there. Girl Boner is all about finding what works for you and giving you the education to experiment safely.

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920