Internet companies are the bad guys, because they are not doing enough to stop child sexual abuse. And child protection groups are the good guys, because they are the ones who are forcing the government to take tough action against “big tech.” Or so the story goes, and most members of the public, not knowing any better, have bought into it. But the reality is that it’s all for show.
The large child protection groups use the politically popular rhetoric of taming big tech because this aligns them with the government. Governments are already predisposed towards tightly regulating social media platforms, whose position in public discourse has become a serious challenge to governmental authority.
But in reality, those who work in the field of child protection know very well that Internet companies don’t hold the keys to eliminating child sexual abuse. Doing that requires a much more community-oriented approach called primary prevention. That’s the approach that Prostasia Foundation is dedicated to.
After all, nine out of every ten cases of child sexual abuse are committed not by strangers on the Internet, but by someone the child knows and trusts. And even when it comes to the sharing of images of child sexual abuse, less than 1% of it takes place on social media, according to the Internet Watch Foundation’s latest Annual Report.
This isn’t to say that Internet platforms don’t have an important role in preventing abuse. But the large platforms have already done most of what can be done to prevent the sharing of known illegal images, by ensuring that images are scanned against a database of illegal content before they can be uploaded or shared (a policy that we support).
Beyond this, pushing platforms to do more is a game of diminishing returns—and escalating risks to civil liberties. The latest ideas being floated include some profoundly dangerous ones, such as using artificial intelligence (AI) algorithms to scan users’ private conversations and photographs, and embedding spying technology inside your web browser.
Prostasia Foundation doesn’t mind exposing this game, because we refuse to play it. By treating child sexual abuse as a problem that only government regulation of the Internet can solve, we ensure that it never will be solved. We have seen how laws such as FOSTA, which politicians claimed would force platforms to better protect children, do anything but that, and instead result in a significant chilling of constitutionally protected speech.
Heavy-handed government intervention also hamstrings Internet platforms that could do a better job of addressing child sexual abuse with a more light-touch, flexible approach. After all, although Internet companies have a social obligation to protect children, this coincides with their private business interest in curtailing the misuse of their platforms, and that’s why they have been the primary innovators in child protection technologies.
However, that’s not to say that Internet companies can be trusted to make the right decisions independently. All of the participants in the Internet ecosystem, from social networks to payment processors, are heavily lobbied by sex-negative morality police, peddlers of anti-porn pseudoscience, and those with an open anti-LGBT and anti-sex worker agenda. It’s no wonder that this tends to result in legitimate speech about sex—including child sexual abuse prevention—being shut down.
So what do we suggest? We suggest opening up the conversation, and allowing other voices to be heard. First and foremost, we think that platforms should be listening to scientists. We cannot address the problem of child sexual abuse without thoroughly understanding what motivates certain people to offend—and what could motivate them not to do so.
Internet companies also need to be speaking to those who are negatively impacted by the over-censorship of sexual speech. This must include sex workers, the community that FOSTA attempted to erase. It should include artists and fans, and the volunteer moderators of their communities. It should include survivors of child sexual abuse, and those who have perpetrated abuse and are on a path towards rehabilitation. It should also include the adult entertainment industry and the consensual kink community, who share a common objective of exercising their sexual freedom without endangering minors.
Until now, these marginalized and excluded communities have had nobody in the child protection community to speak up for them, and Internet companies have had no way of hearing their voices. But on May 23, this will all change as Prostasia Foundation hosts a unique event: a Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, & Child Protection.
At this event, participants will begin by hearing from the top academic and industry experts on what Internet companies should know about child protection. Then in the afternoon they will apply this knowledge in a series of case studies that will take them on a deep dive into some complex grey areas:
Lolicon, shotacon, and furry fandom. Lolicon and shotacon manga and furry cub art are cartoon art forms that have recently been banned on platforms like Discord, Twitter, and Reddit on the grounds that they sexualize children. Is this the right place to draw a line?
Nudist websites that cross the line. We can accept that nudism (naturism) is a legitimate lifestyle and that families do participate in it. However, nudist community organizers need to take responsibility to make sure that children are not being exploited. How can Internet companies work with this community?
MAPs and registered citizens on social media. Several platforms disallow those who identify as Minor-Attracted Persons (MAPs) from posting their thoughts online because it offends their other users, while others place a blanket ban on users who are on sex offender registries. Do these policies make sense?
Child modeling. There is a range of legal but ethically dubious child modeling content online, from mainstream platforms such as Instagram, through to websites that feature content from studios associated with underage nude modeling. Whose responsibility is it to ensure a child model’s welfare?
DD/lg and ageplay. The sexual fetishes of DD/lg (Daddy Dom/little girl) and ageplay are immensely popular, especially with younger women. Some online representations of this aesthetic are overtly or implicitly sexual. Should these representations of child sexuality be allowed? With what limits?
These aren’t, by a long shot, the only interesting or important edge cases that Internet companies have to deal with when moderating sexual content. But they are examples from which we hope the meeting can jointly develop an approach to thinking about such questions in a larger, more inclusive context.
We will never stem the tide of child sexual abuse through a regime of centralized control and censorship. We can only do so by providing the entire community with the information and support needed to prevent acts of abuse from being committed in the first place. And this means not automatically acceding to the dictates of government, or the demands of a few wealthy and well-connected nonprofits.
FOSTA is proof that pitting the government against the Internet industry is an approach that is doomed to fail, and to harm innocent people. Multi-stakeholder collaboration is a better approach, one that will empower Internet platforms of all sizes to make better, more evidence-based decisions on sexual content moderation that protect children, while also respecting the rights and freedoms of all. And our May meeting is the first step towards making it a reality.