Prostasia Newsletter #35—June 2021
Prostasia Foundation Protecting children by upholding the rights and freedoms of all
Pushing censorship over the borderline

When online platforms and service providers introduce new restrictions on sexual content, those restrictions tend to be both persistent and cumulative: an instance of the so-called ratchet effect. Tumblr’s outright ban on sexual content in 2018, Pornhub’s purge of 80% of its content in December 2020, and eBay’s ban on sexual content in May 2021 are not changes that we should expect to see reversed any time soon. If anything, they lay the groundwork for further restrictions, frequently cloaked in “think of the children” rhetoric.

Like the movement of territorial borders, the ratcheting up of restrictions on sexual content online proceeds at an uneven pace. Sometimes it affects large swathes of content and is driven by a high-level policy change (as in eBay’s and Tumblr’s cases). At other times it advances slowly, measured by the steady deplatforming of individual creators at the whim of low-level content moderators. In either case, we should not allow the renegotiation of the borderline between acceptable and unacceptable content to pass unnoticed.

What is borderline content?

“Borderline content” is a term that Internet platforms use to describe speech that toes the line between acceptable and unacceptable. It doesn’t just apply to sexual content, but also to content that tests the borderline of platform rules on other topics such as hate speech and terrorism. Sometimes, platforms do not censor borderline content completely, but employ techniques such as shadow banning to ensure that it cannot be surfaced through searches, recommendations, or autocomplete suggestions. Jillian York talks more about borderline content in her book Silicon Values, which is reviewed later in this newsletter.

In the case of child exploitation, borderline content is content that isn’t illegal, but that is considered to push against platform policies against the exploitation or sexualization of minors. Deliberately fuzzy platform terms of service on these topics (such as Reddit’s injunction against “sexual or suggestive content involving minors or someone who appears to be a minor”) give content moderators significant latitude to exercise their own subjective judgment on what falls on either side of the borderline.

Platforms also face significant external pressure to shift the line towards greater restrictions on speech. For example, the Canadian Center for Child Protection argues that it and other “trusted hotlines” are entitled to apply a set of much looser and broader standards—such as whether an image “more likely than not” represents a child—in directing Internet platforms to remove borderline content. In light of the Canadian Center’s role in having cartoon images and image search engines reported as child pornography, the prospect of it being directly empowered to adjudicate cases of borderline content is a chilling one.

Applying a flexible standard to borderline content

Prostasia and our partners dealt with borderline child exploitation content extensively in our May 2019 Multi-Stakeholder Dialogue on Internet Platforms, Sexual Content, & Child Protection, which included a number of real-life case studies as examples. In some cases, our participants suggested that content often considered as borderline was worthy of stronger protection (such as ageplay and DD/lg themed fetish content), while in other cases there was concern that content creators were “gaming” content rules and that more active moderation was warranted (for example, some exploitative faux-nudist content).

In the outcomes of the event, it was recognized that the difference between content falling on one side or the other of the borderline often depends on the context in which it appears. As such, there is merit in adopting a flexible standard when deciding how such content should be treated, rather than imposing blanket censorship rules. For example, it may be proper for children to be able to post videos of their own exercise routines, while being improper and exploitative for YouTube to automatically collect those videos into a recommendation playlist or to allow them to gather inappropriate comments.

The borderline between art and exploitation

The latest example of censorship creep at the borderline between legitimate content and child exploitation has been the deplatforming in May 2021 of several art websites, the best known of which was Pigtails in Paint, which explored representations of girlhood in art and media. According to our source, UK police put an ultimatum to the sites’ web host requiring it to discontinue providing service, even though no court order was provided.

The content of Pigtails was a good example of borderline content. Among many other artworks depicting girl children, it occasionally included depictions of child nudity, mostly illustrated but in some cases photographic. It is easy to see how collections of such images, if presented in a sexualized context, could be considered exploitative. The censored websites didn’t present them in such a context, and included commentary and scholarly discussion, but nevertheless attracted a stream of complaints. In 2017, Pigtails’ administrator wrote:

Simply because some people are uncomfortable with the frankness in which sensitive subjects are handled here does not mean that we are witnessing examples of abuse.  In fact, real abuse and unethical exploitation of children is exposed and discussed whenever appropriate. Pigtails has been reported to law enforcement and watchdog authorities many times and, except those with a specific biased agenda, they have given this site a clean bill of health.

In a previous newsletter, we reported on the single occasion on which content hosted on Pigtails was reported as child pornography and removed. Ironically, that item was not a photograph, but a drawing: a page of the Ignatz Award-nominated semi-autobiographical comic Daddy’s Girl by Debbie Drechsler, depicting her own harrowing story of incestuous child sexual abuse. So it always goes with censorship.

The solution to over-censorship

Because borderline content isn’t illegal and because platforms are generally private companies that are free to choose their own users and customers, they have free rein to deplatform marginalized sexual speech, typically with little to no public oversight or transparency. But they can’t pretend that these decisions don’t have side effects.

Loosely defining child exploitation isn’t a sustainable solution for platforms seeking to address the actual harms inherent in some borderline content. The looser the definition, the more child sexual abuse survivors, LGBTQ+ people, sex workers, artists, and fans will be wrongly deplatformed—with no benefit to children.

Through our May 2019 Multi-Stakeholder Dialogue, Prostasia and its partners developed a set of Best Practice Principles for Sexual Content Moderation and Child Protection that online platforms could use to guide their decisions about borderline content. They begin by drawing a simple distinction between content that directly harms a child (which is generally coincident with the legal definition of child pornography, and should be immediately removed), and content that is merely supposed to carry indirect harms, which must not be treated in the same way:

Sexual content should be restricted where it causes direct harm to a child. Indirect harms should not be the basis for blanket content restriction policies unless those harms are substantiated by evidence, and adequate measures are taken to avoid human rights infringements.

Conclusion

Child exploitation has always been the lever that conservative groups use to ratchet up the censorship of sexual content in general, and there are no signs of this changing any time soon. Prostasia’s advocacy has not so far induced any large corporate platforms to reverse course and treat borderline content with the mindfulness and nuance that it deserves.

However, there is hope. Smaller platforms, run by and for the same marginalized groups whose sexual speech and expression is most directly impacted by censorship, are rising to meet the challenge of providing safe online spaces for sexual expression that don’t tolerate image-based abuse or other forms of direct child exploitation. Continue reading this newsletter to discover more about some of these platforms and how they are navigating an increasingly hostile environment for freedom of sexual expression online.

Recent blog posts
Child trafficking narratives are misleading
When most people hear “human trafficking,” they think of kidnapped young cis, mostly white girls sold into enforced sexual slavery by criminal gangs composed mostly of people of color. Lurid…
Read more...
Reducing self-stigmatization in MAPs can reduce risks of offending
Minor Attracted Persons, or MAPs, are the target of social and legal censure; they are hated and despised. Often, they hate and despise themselves. Some hope that the stigma and…
Read more...
Fantasy is NOT abuse: advance access pass
Gain advance access to the recording of the livestreamed event, Fantasy is NOT abuse. Your contribution goes towards our $5000 fundraising goal, which will be split between The HEAL Project…
From: $0.00
View product
Prostasia lends a hand to ASAP

by Tabitha Abel

ASAP, the Association for Sexual Abuse Prevention, held its 5th one-day workshop in Clearwater, FL—and virtually—on Monday, May 3, 2021, with the help of Jeremy Malcolm, Executive Director of Prostasia Foundation. Collaboration between organizations that take different routes to similar goals helps each of them reach those goals. We were the grateful recipients of Jeremy’s IT skills, and of two excellent presentations delivered to an audience of therapists, counselors, MAPs (Minor-Attracted Persons), and others who attended the 8-hour workshop.


ASAP invited Prostasia and B4U-ACT to join them this year. Our keynote speaker was David Prescott, LICSW, whose virtual presentations drew from a long career in counseling NOMAPs, sex offenders, and those affected by sexual abuse in its many forms. Psychology CEUs were available to attendees.


ASAP began in 2015 as a 501(c)(3) organization designed to fill the void in connecting MAPs to enlightened professional help, peer support, guidance, and knowledge about dealing successfully with pedophilia, an unwanted attraction.


Gary Gibson and his wife set up ASAP to decrease the risk of Child Sexual Abuse (CSA) by assisting MAPs, particularly stigmatized, virtuous pedophiles (Non-Offending Minor-Attracted Persons, NOMAPs), who, without such support, may become depressed and suicidal, or eventually act on that attraction and molest a child. Suicide notes do not tell their whole truth. With the help of other professionals, ASAP developed a list of around 500 therapists who understand pedophilia, available to help dispel the ignorance and hatred that abounds towards this population. ASAP continues to advance the truth about pedophilia by participating in research studies, giving radio and TV interviews, and writing articles and letters to promote its goal of decreasing CSA by educating mental health care providers, leaders in local and state government, educators, religious communities, law enforcement, and others. ASAP’s workshops are part of its feet-on-the-ground education program.


For people who have spent much of their lives coping with this devastating attraction, with secrecy, and with the fear of being discovered, ASAP’s workshops bring relief and encouragement as they learn that help is on its way, however slowly. VPs (virtuous pedophiles) learn that they can live a life free of acting on the attraction, and that they are not alone. ASAP, researchers, and other organizations are working together to make a difference. The truth is that “pedophile” is not synonymous with “child molester”, and molesting a child is not inevitable for boys, girls, men, or women who are sexually attracted to children. Help is coming.


ASAP’s goals are unchanged: 1) connect MAPs with professional help; 2) provide peer support to MAPs; 3) educate the public (including high school students, through sex-education classes, before they act out and molest younger kids) with the truth about pedophilia; and 4) decrease CSA. ASAP’s work is challenging, but in collaboration with Prostasia and other groups we are encouraged, and our 24/7 helpline is making the world a safer place for children. In time, pedophilia will no longer be swept under the carpet, but recognized as a treatable mental health problem, and professional help will be available to those needing such support.

A conversation with Byrdy Lynn, part 2

Erin Gould continues his conversation with Byrdy Lynn (author of Through The Storm Of Early Trauma: Healing & Overcoming) about her triumphant survival of early childhood trauma. In this episode, they discuss her witnessing of a murder at a young age, confronting racism in her high school, running away from home, and the sexual abuse she suffered from a childhood friend.

Review: Silicon Values
by Jillian York
Reviewed by Jeremy Malcolm

"What is the right balance between sexual freedom and the protection of children?" asks Jillian York in the concluding chapter of Silicon Values: The Future of Free Speech Under Surveillance Capitalism. In a society that views sexual expression and childrens' use of the Internet as antithetical, even asking this question invites suspicion.

But York's questions are reasonable and they demand answers. Silicon Values is a trenchant critique of how platform censorship is enabled by surveillance capitalism—the online business model built on the commodification of personal data for profit. In her investigation of this topic, York refuses to look away from difficult edge cases, where platform policies have disproportionate impacts on the human rights of marginalized groups.

A defining contribution of the book is York's searing analysis of how processes for the removal of abusive content at scale disproportionately affect "women, queer and trans people, and sex workers." As she points out, this "War on Sex" predates SESTA/FOSTA and is "the result of both well-meaning-but-misguided anti-sex trafficking activism and the conservative predilections of payment processors, corporate executives, and politicians."

York traces the origins of this moral panic to an early, discredited study about the prevalence of pornography on the Internet, which in turn led to the passage of a landmark law on Internet platform liability. She documents how major Internet platforms have since ratcheted up their limitations on sexual content in response to this rising political tide of sex-negativity, pointing to examples such as Facebook's ban on breastfeeding photos (which led to protests in 2009 and an eventual partial relaxation of the policy), and Tumblr's banning of all sexual imagery following the passage of FOSTA in 2018. York writes:

At a time when attitudes towards sex work, transgender individuals, and other sexual minorities are by and large changing for the better, it is perhaps ironic that Silicon Valley's CEOs are so rapidly closing off the spaces where such communities have long gathered.

Among many examples of these policies misfiring on the most marginalized, she recounts how a Google algorithm designed to downrank or filter "toxic" comments rated the speech of drag queens as toxic, because of their use of slang terms and reclaimed slurs that were once used against their community. "As it turns out," York writes, "the best way to get rid of undesirable content is to cast a wide net… no matter if a few dolphins get caught up in it."

She also identifies that these broad content restriction policies of Internet platforms haven't been of entirely endogenous origin; rather, she characterizes platforms as "watching, and waiting, for the public sentiment around any given controversy to gain enough momentum that they had to respond accordingly." This explains (to use an example not given by York) Twitter's actions against the anti-abuse MAP community, in which an initially science-led approach was undermined by growing public sentiment against that community.

Another factor driving the increasingly precarious state of sexual content online has been the growing power of "backdoor collaborations between governments and platforms." Evelyn Douek has described these arrangements as content cartels, and the Electronic Frontier Foundation (which employs York, and formerly this reviewer) has described them as shadow regulation. York is unsparing in her criticism of major platforms for this complicity, writing that:

As Zuckerberg et al. have lined their pockets with income generated from the advertisers preying on the world's citizens, they have increasingly cozied up to power, with apparently little concern for how that power is derived.

However, unlike some cyber-libertarians, York does not promote a laissez-faire approach to the protection of children online. "I am an advocate for free expression not because I believe that all speech is equally important, or that all speech is good," she writes, "but because I believe that power corrupts, and absolute power corrupts absolutely, which makes it nary impossible to trust an authority to censor effectively."

She acknowledges that the criminalization of child sexual abuse material is a measure that "all but the most depraved individual might agree with; it is also—despite that consensus—censorship" (a point that Prostasia Foundation has also made). She also asserts, probably correctly, that despite the sexual speech of marginalized groups being over-censored, "When it comes to child sexual abuse imagery, society is largely comfortable with the potential of over-moderation."

These impacts on minority groups are inevitable, argues York, when "the most marginalized members of society are rarely invited to the table to lend their views as policy is being created." Instead, the powerful and regressive interests behind censorship content cartels have embedded themselves into platforms' policy teams, as "policy hires increasingly come from government, law enforcement, or the policy teams of other corporations, which has created a revolving door through which only a certain subset of people can enter."

Although this paints a depressing picture—and York acknowledges that these are difficult problems without clear solutions—she holds out some hope that if major platforms continue to refuse to listen to their users, then new platforms that genuinely promote freedom of expression may rise to replace them, and that these may be "built and organized by BIPOC, trans folk, and sex workers." Some signs of this can already be seen, with the rise of platforms such as Tryst for and by sex workers, and Fanexus for and by queer fans.

The War on Sex (which takes its own chapter in Silicon Values) is only one of the themes that York explores in this book; she also covers platform censorship of disfavored political speech, hate speech, and other forms of challenging content. At no point does she over-simplify the difficulty of how speech on these topics can be allowed while maintaining a safe and inclusive online environment for all. "But these are questions that we as a society must debate," she urges, "rather than leaving their outcomes to corporations."

Silicon Values: The Future of Free Speech Under Surveillance Capitalism is recommended reading from an activist who has been covering platform moderation for longer than several of today's large platforms have existed. In an area that doesn't lend itself to simple solutions, York's insights lay an important foundation for ongoing intelligent discussion of the topic.

Can we count on your support?

We encourage you to help us further strengthen our ability to combat child sexual abuse and reinforce our presence as pioneers in the child protection sphere. One of our partner organizations recently praised Prostasia for the “outside of the box thinking” that has contributed to our success in preventing child sexual abuse and upholding human rights for all.


Prostasia invites you to support our efforts. You can donate to specific causes, such as our Get Help service or bringing transparency to the child protection sector, or you can make a general donation to Prostasia. Your contribution will help us continue to pursue unique, innovative, and evidence-based approaches. It’s your support that allows Prostasia to thrive!


Donate now

Note: Links to products in this newsletter may be affiliate links, which pay Prostasia Foundation a small commission on sales.

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920