Eliminating child abuse images

We are committed to eliminating unlawful sexual images of minors, also known as child sexual abuse material (CSAM), child sexual exploitation material (CSEM), or child pornography. But unlike other groups, we don’t believe that this can be done through law enforcement or technology alone. Because this is a human problem, we have to take account of the humanity of its perpetrators and offer a nuanced solution that addresses them in their complexity.

Technological tools and mental health support for potential perpetrators of image abuse are complementary interventions. Here are the three ways Prostasia is working on eliminating the use of abuse images: through technology, deterrence, and support for mental wellness.

Technology: image scanning for chat groups

We provide a unique open source solution for administrators of Rocket.Chat groups, enabling them to scan for and quarantine known unlawful images automatically. The GitHub repository for this software is linked below.


rocketchatcsam by prostasia

This Rocket.Chat app checks uploaded images against the Microsoft PhotoDNA cloud service and quarantines those identified as child abuse images (child pornography or CSEM).
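The matching flow can be sketched roughly as follows. This is an illustrative sketch, not the app’s actual source (see the GitHub repository above): the endpoint URL, subscription-key header, and `IsMatch` response field follow Microsoft’s published PhotoDNA cloud Match API, but they should be verified against Microsoft’s current documentation before use.

```python
# Illustrative sketch of a PhotoDNA match check for an uploaded image.
# Endpoint and field names are based on Microsoft's PhotoDNA cloud
# Match API as publicly documented; treat them as assumptions.

PHOTODNA_MATCH_URL = "https://api.microsoftmoderator.com/photodna/v1.0/Match"


def build_match_request(image_bytes: bytes, api_key: str) -> dict:
    """Assemble the parts of an HTTP request for a PhotoDNA Match call."""
    return {
        "url": PHOTODNA_MATCH_URL,
        "headers": {
            "Content-Type": "application/octet-stream",
            # Subscription key issued by Microsoft to vetted organizations.
            "Ocp-Apim-Subscription-Key": api_key,
        },
        "body": image_bytes,
    }


def should_quarantine(match_response: dict) -> bool:
    """Quarantine the upload if PhotoDNA reports a hash-database match."""
    return bool(match_response.get("IsMatch", False))
```

In the real app, a matched upload is withheld from the channel rather than posted; only the hash comparison happens in the cloud, so moderators never need to view the image itself.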

Deterrence: our “CP campaign”

“CP” is the Internet’s colloquialism for the unlawful depictions of children that the law prosecutes as child pornography. Most people know that looking at these images and videos is both wrong and illegal. But some will seek it out anyway. Reaching out to them before they do so could prevent a crime and could save a child from being revictimized.

Unlawful images can be found on mainstream Internet platforms, but they are usually discovered and reported quickly, and Internet companies are continually improving the speed and accuracy with which they do this. That’s why a large proportion of trading of unlawful images is instead done over encrypted chat or file-sharing services, or over encrypted network connections, which cannot be so easily intercepted or taken down.

One of those encrypted networks is the Tor network, which is used around the world by those wishing to access the Internet with greater privacy, for a variety of mostly legitimate reasons… but is also used by a few to share “CP”.

Banning privacy-enabling encryption technologies such as Tor, with their many important legitimate uses, is not the answer. But what might be the answer is to reach out to those who are bound for the Tor network with child abuse primary prevention messages, especially if there is also any indication that they are seeking unlawful images.

So far Prostasia has a rotating set of three Google search ads (some shown here), thanks to a generous in-kind donation from Google. But our objective is to do more: to include advertisements on privacy-focused search engines (like DuckDuckGo), on privacy- and security-related websites and products, and on websites hosted on the Tor network itself.

Support for mental wellness: MAP Support Club

Prostasia Foundation is the institutional partner of a peer-support group called MAP Support Club, which is dedicated to providing a safe space for those who have unchosen sexual feelings that can lead towards the use of unlawful images of minors. MAP Support Club also provides access to professional support that can reduce offending behavior and harm to self or others. Our Rocket.Chat CSAM scanning tool is used in MAP Support Club, for which it was originally developed.

Donate to support this work

Help eliminate child abuse images

Eliminating child abuse images without relying on police and censorship

