In a blog post yesterday announcing that it would ban adult content from the site come December 17, Tumblr took pains to point out “something that should not be confused with today’s policy change: posting anything that is harmful to minors, including child pornography, is abhorrent and has no place in our community.”
But Tumblr’s attempt to distance its new adult content ban from its fight against child pornography only underscores how closely the two are actually linked. After all, Tumblr’s removal from the Apple App Store over its failure to control child pornography on its platform had been reported a mere two weeks earlier. A ban on all nudity and sexual content was its easiest path to having the app restored—though certainly not the best path forward, given the ban’s impact on those using the site to post lawful adult content.
Tumblr knew about its child sexual abuse problem
Tumblr’s removal from the App Store did not come without warning. For years, its own users had called Tumblr a “hellsite” due to problems such as bullying, extremism, sexual grooming of minors, and the exchange of child pornography—or, to use a better term, unlawful images of minors. But it was also a place for creativity, friendship, and learning for its users. For many, it was something of a microcosm of the Internet itself.
In its reaction to the App Store ban, Tumblr blamed a shared industry child pornography database for its failure to eliminate images from its website. We called Tumblr out for that statement because much of its problem with unlawful images came about from its youngest users posting naked images of themselves. Though seldom acknowledged, this is the most common category of unlawful images of minors, both on and off Tumblr.
As Tumblr well knows, the shared industry database that it blamed for its problems was never intended to catch newly uploaded images that haven’t already been reviewed by a human being and determined to be unlawful. In fact, we have recently expressed concerns about the effectiveness of new experimental systems from Facebook and Google that do claim to be able to make this determination algorithmically.
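The limitation of such databases can be illustrated with a short sketch. Real systems use licensed perceptual-hashing technology that is robust to resizing and re-encoding; the exact SHA-256 lookup below is a deliberate simplification, and the image values are hypothetical, but the core point is the same: a lookup can only flag content that a human reviewer has already confirmed and added to the database.

```python
import hashlib

# Hashes of images that human reviewers have already confirmed as unlawful.
# (Hypothetical placeholder values; real databases hold perceptual hashes.)
known_hashes = {hashlib.sha256(b"previously-reviewed image").hexdigest()}

def matches_known_database(upload_bytes: bytes) -> bool:
    # A database lookup can only match content already in the database.
    return hashlib.sha256(upload_bytes).hexdigest() in known_hashes

# A re-upload of a known image is caught...
assert matches_known_database(b"previously-reviewed image")
# ...but a never-before-seen image produces no match at all.
assert not matches_known_database(b"never-before-seen image")
```

This is why a platform cannot rely on such a database to catch images newly created and uploaded by its own users: by definition, those images are not yet in anyone’s database.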
Although technology continues to advance, the current best technical state of the art in weeding out unlawful images of minors that haven’t been observed before remains the human eye, and the best way of making sure that such images are removed quickly is by enlisting the help of the site’s users. Therefore, we have recommended making it easier for them to report offending content, rather than merely blocking it.
This is one area in which Tumblr falls down. Unlike Twitter, for example, which integrates a reporting form into every post, Tumblr expects its users to know that they have to access a separate page on its website, copy and paste the URL of the post that they wish to report, and… well, to be honest, most users won’t even get that far. As a result, unlawful material was known to persist in Tumblr’s darkest corners for days or weeks, while similar content was being removed from other platforms within minutes or hours.
Tumblr failed to adopt primary prevention best practices
Technology alone will never eliminate child sexual abuse. Therefore we also recommend that users who will inevitably upload images of themselves while underage, and those who will inevitably access such images, receive education and resources about the serious consequences of doing so. Many users, particularly young users, simply don’t know that what they are doing is against the law, and don’t fully appreciate the likelihood that they may come to regret sharing or viewing such images online.
The kind of education and resourcing needed to address this problem at its root is an integral part of what is called primary prevention. Importantly, this can’t be the platform’s sole responsibility, but is a shared responsibility that falls more heavily on parents, teachers, caregivers, and at some level every adult who comes into contact with a child. But although we can’t place full responsibility on platforms, we can at least hope that they won’t undermine professionals who are using the platform for this kind of primary prevention mission.
And this is another area in which Tumblr, unfortunately, fell down in controlling child sexual abuse on its platform. It failed to invest in its trust and safety team, or at least failed to give that team the leadership and the authority to make the right decisions to protect children. Instead, Tumblr’s decisions were driven first and foremost by a desire to protect its brand from stigma. Months before the latest adult content ban, this resulted in it terminating accounts devoted to child sexual abuse prevention, while allowing abusers to go untouched.
Naïvely, Tumblr assumed that it could deal with its sexual abuse problem (or at least, show that it was doing something) by excising pedophiles from its website. But it chose to focus its attention on those who openly proclaimed themselves to be pedophiles, not on those who were actually trading unlawful images of minors or sexually grooming others on the website. Surprisingly, these were generally two separate groups.
Most of those who openly admitted to being attracted to minors on Tumblr were those who recognized that they had a problem, and were trying to find help for it from peers and professionals on the site (including a member of our Advisory Council, whose account Tumblr terminated). Most of those posting unlawful images, on the other hand, didn’t identify as pedophiles at all, and many of them weren’t—they were Tumblr’s own underage users, and those who opportunistically groomed and exploited them.
How Tumblr could have done better
Tumblr wouldn’t have fallen into this trap if only it had listened to child sexual abuse prevention experts, who could have told it all of this. But when we wrote to Tumblr’s parent company Oath on September 2 to request “a substantive and confidential discussion with you about the fight against child exploitation,” we received no reply. Our other attempts to engage with Tumblr on these issues have also been rebuffed, even as companies larger than Tumblr have engaged with us productively and in good faith.
And although it may seem like hubris coming from an organization as new as Prostasia Foundation, we think that our concerns deserve a response. Why? Because of who we are. Prostasia Foundation’s Board and expert advisors come from all walks of life, and unlike any other child protection organization, we endeavor to include representatives from all of the sectors that are most deeply harmed when the fight against child sexual abuse misfires against those who are innocent.
We include mental health professionals, whose research and clinical practice could be vital to unlocking solutions to child sexual abuse, but who come under attack whenever their work challenges preconceived stereotypes about how abuse happens. We include survivors of child sexual abuse, who are also currently stigmatized and isolated by a society that treats them as complicit in their own abuse. We include human rights and criminal justice experts, who defend due process and the rule of law for the protection of all. We include sex workers, who continue to bear the brunt of FOSTA, a law that we hope to help overturn. And yes, we also include Tumblr users.
Tumblr is a hellsite today, and it’s only going to become more of one, as enforcement of the policy becomes mired in definitional debates over “female-presenting nipples.” But its users deserve better than to be treated in the way that Tumblr has treated them. Due to its failure to engage with experts, its policies on harm to minors are ambiguous, unscientific, and applied in an arbitrary and inconsistent way. Due to Tumblr’s failure to abide by best practice standards of transparency and accountability in content moderation, such as the Santa Clara Principles, hundreds of Tumblr users have been left without recourse against its application of these flawed policies.
What other companies have learned by engaging with experts has informed their approach to dealing with child sexual abuse online. Twitter has problems of its own, and has also been described as a “hellsite” by some. But in general it has done a far better job than Tumblr of quickly removing unlawful images of children and terminating the accounts of those who trade in them, while allowing free speech about child sexual abuse prevention, and also continuing to allow most lawful adult content.
Tumblr may think its adult content ban is the simplest solution to its child sexual abuse problem, but it isn’t. Containing child sexual abuse on the website was a relatively well-defined problem, and one at which Tumblr failed. Containing all adult content on the site will be orders of magnitude more difficult, and Tumblr has given no evidence that it is up to the task. As such, the brunt of Tumblr’s failure at managing its platform will be borne by artists, fans, the LGBTQ+ community, sex educators, sex workers, survivors of child sexual abuse, therapists, researchers, and journalists.
We believe that Internet platforms can do better. That’s why next May, Prostasia Foundation will be bringing together Internet platforms in a roundtable, expert-led discussion about how they can reduce child sexual abuse in a way that is informed by scientific evidence, and that upholds the human rights of the minority groups who are most affected by censorship. If Tumblr still exists by then, its representatives will be welcome to attend.