From the Newsletter: Pushing censorship over the borderline

Editor’s Note: Prostasia is reprinting useful material from our newsletter for blog readers. This essay first ran in June 2021. To keep up to date on everything happening at Prostasia, you can sign up for the newsletter here.
_____

When online platforms and service providers introduce new restrictions on sexual content, those restrictions tend to be both persistent and cumulative: an instance of the so-called ratchet effect. When eBay banned sexual content in May 2021, when Pornhub purged 80% of its content in December 2020, and when Tumblr banned sexual content altogether in December 2018, none of these were changes that we should expect to see reversed any time soon. If anything, they lay the groundwork for further restrictions, frequently cloaked in “think of the children” rhetoric.

Like the movement of territorial borders, the ratcheting up of restrictions on sexual content online proceeds at an uneven pace. Sometimes it affects large swathes of content and is driven by a high-level policy change (as in eBay’s and Tumblr’s cases). At other times it advances slowly, measured by the steady deplatforming of individual creators at the whim of low-level content moderators. In either case, we should not allow the renegotiation of the borderline between acceptable and unacceptable content to pass unnoticed.

What is borderline content?

“Borderline content” is a term that Internet platforms use to describe speech that straddles the line between acceptable and unacceptable. It doesn’t just apply to sexual content, but also to content that tests the limits of platform rules on other topics such as hate speech and terrorism. Sometimes, platforms do not censor borderline content completely, but employ techniques such as shadow banning to ensure that it cannot be surfaced through searches, recommendations, or autocomplete suggestions. Jillian York talks more about borderline content in her book Silicon Values, reviewed in our newsletter here.
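To make the mechanics concrete, here is a minimal sketch, in Python, of how a shadow ban differs from outright removal. The names and flags are hypothetical illustrations, not any platform’s actual implementation: the shadow-banned post stays reachable by direct link, but is silently excluded from every discovery surface at once.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    removed: bool = False        # taken down entirely
    shadow_banned: bool = False  # hidden from discovery, but not deleted

def visible_by_direct_link(post: Post) -> bool:
    # A shadow-banned post is not deleted: its author, and anyone who
    # already has the URL, can still load it. That is what makes the
    # ban hard for the poster to detect.
    return not post.removed

def eligible_for_surfacing(post: Post) -> bool:
    # Search, recommendations, and autocomplete all draw on the same
    # candidate pool, so a single flag removes the post from every
    # discovery surface at once.
    return not post.removed and not post.shadow_banned

def search(posts: list[Post], query: str) -> list[Post]:
    return [p for p in posts
            if eligible_for_surfacing(p) and query.lower() in p.text.lower()]
```

The asymmetry between those two checks is the whole technique: nothing visibly changes for the poster, while the audience quietly disappears.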

In the case of child exploitation, borderline content is content that isn’t illegal, but that is considered to push the boundaries of platform policies against the exploitation or sexualization of minors. Deliberately fuzzy platform terms of service on these topics (such as Reddit’s injunction against “sexual or suggestive content involving minors or someone who appears to be a minor”) give content moderators significant latitude to exercise their own subjective judgment on what falls on either side of the borderline.

Platforms also face significant external pressure to shift the line towards greater restrictions on speech. For example, the Canadian Centre for Child Protection argues that it and other “trusted hotlines” are entitled to apply a set of much looser and broader standards—such as whether an image “more likely than not” represents a child—in directing Internet platforms to remove borderline content. In light of the Canadian Centre’s role in having cartoon images and image search engines reported as child pornography, the prospect of it being directly empowered to adjudicate cases of borderline content is a chilling one.

Applying a flexible standard to borderline content

Prostasia and our partners dealt with borderline child exploitation content extensively in our May 2019 Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, & Child Protection, which included a number of real-life case studies as examples. In some cases, our participants suggested that content often considered borderline was worthy of stronger protection (such as ageplay and DD/lg-themed fetish content), while in other cases there was concern that content creators were “gaming” content rules and that more active moderation was warranted (for example, some exploitative faux-nudist content).

In the outcomes of the event, it was recognized that whether content falls on one side or the other of the borderline often depends on the context in which it appears. As such, there is merit in adopting a flexible standard when deciding how such content should be treated, rather than imposing blanket censorship rules. For example, it may be proper for children to be able to post videos of their own exercise routines, while being improper and exploitative for YouTube to automatically collect those videos into a recommendation playlist or to allow them to gather inappropriate comments.
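To illustrate, here is a rough sketch of what such a flexible, context-sensitive standard could look like if expressed as code. The surface names and the policy choices are hypothetical, drawn from the YouTube example above; this is a sketch of the idea, not any platform’s actual logic.

```python
from enum import Enum, auto

class Surface(Enum):
    DIRECT_VIEW = auto()     # the uploader's own channel or page
    RECOMMENDATION = auto()  # algorithmic playlists and "up next" feeds
    COMMENTS = auto()        # the comment section beneath the video

def allowed(is_borderline: bool, surface: Surface) -> bool:
    # Clearly acceptable content is allowed everywhere.
    if not is_borderline:
        return True
    # Borderline content (e.g. a child's own exercise video) may stay up
    # where it was posted...
    if surface is Surface.DIRECT_VIEW:
        return True
    # ...but is kept out of recommendation playlists and has its comments
    # closed, since those are the contexts in which exploitation occurs.
    return False
```

The point of the sketch is that the decision takes two inputs, not one: the same video yields different answers depending on where it would appear.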

The borderline between art and exploitation

The latest example of censorship creep at the borderline between legitimate content and child exploitation has been the deplatforming in May 2021 of several art websites, the best known of which was Pigtails in Paint, a site that explored representations of girlhood in art and media. According to our source, UK police delivered an ultimatum to the sites’ web host requiring it to discontinue providing service, even though no court order was provided.

The content of Pigtails was a good example of borderline content. Among many other artworks depicting girl children, it occasionally included depictions of child nudity, mostly illustrated but in some cases photographic. It is easy to see how collections of such images, if presented in a sexualized context, could be considered exploitative. The censored websites didn’t present them in such a context, and included commentary and scholarly discussion, but nevertheless attracted a stream of complaints. In 2017, Pigtails’ administrator wrote:

Simply because some people are uncomfortable with the frankness in which sensitive subjects are handled here does not mean that we are witnessing examples of abuse.  In fact, real abuse and unethical exploitation of children is exposed and discussed whenever appropriate. Pigtails has been reported to law enforcement and watchdog authorities many times and, except those with a specific biased agenda, they have given this site a clean bill of health.

In a previous newsletter, we reported on the single occasion on which content hosted on Pigtails was reported as child pornography and removed. Ironically, that item was not a photograph but a drawing: a page of the Ignatz Award-nominated semi-autobiographical comic Daddy’s Girl by Debbie Drechsler, depicting her own harrowing story of incestuous child sexual abuse. So it always goes with censorship.

The solution to over-censorship

Because borderline content isn’t illegal and because platforms are generally private companies that are free to choose their own users and customers, they have free rein to deplatform marginalized sexual speech, typically with little to no public oversight or transparency. But they can’t pretend that these decisions don’t have side effects.

Loosely defining child exploitation isn’t a sustainable solution for platforms seeking to address the actual harms inherent in some borderline content. The looser the definition, the more child sexual abuse survivors, LGBTQ+ people, sex workers, artists, and fans will be wrongly deplatformed—with no benefit to children.

Through our May 2019 Multi-Stakeholder Dialogue, Prostasia and its partners developed a set of Best Practice Principles for Sexual Content Moderation and Child Protection that online platforms could use to guide their decisions about borderline content. They begin by drawing a simple distinction between content that directly harms a child (which is generally coincident with the legal definition of child pornography, and should be immediately removed) and content that is merely presumed to carry indirect harms, which must not be treated in the same way:

Sexual content should be restricted where it causes direct harm to a child. Indirect harms should not be the basis for blanket content restriction policies unless those harms are substantiated by evidence, and adequate measures are taken to avoid human rights infringements.

Conclusion

Child exploitation has always been the lever that conservative groups use to ratchet up the censorship of sexual content in general, and there are no signs of this changing any time soon. Prostasia’s advocacy has not so far induced any large corporate platforms to reverse course and treat borderline content with the mindfulness and nuance that it deserves.

However, there is hope. Smaller platforms, run by and for the same marginalized groups whose sexual speech and expression is most directly impacted by censorship, are rising to meet the challenge of providing safe online spaces for sexual expression that don’t tolerate image-based abuse or other forms of direct child exploitation.
