CSAM deterrence

The Prostasia Foundation is committed to eliminating abusive content from the internet and preventing people from viewing it whenever possible. Many groups pursue these goals by working with law enforcement and implementing technology-based measures. However, research suggests that there is a third, potentially more effective, approach: deterrence.

Child sexual abuse material (CSAM) refers to images and videos of one or more children being sexually abused or exploited. In the legal world it is known as child pornography, but this term is not ideal, as “pornography” wrongly implies that consent was given.

CSAM adds to the trauma that abuse survivors grapple with by forcing them to come to terms with the fact that images or videos of their abuse exist on the internet, available for anyone to view. In some cases, such as when a child is manipulated into sending explicit images of themselves to an abuser, victims may blame themselves for the existence of this content, further adding to the shame experienced by many survivors.

Unlike legal and technological approaches, which typically only go into effect after CSAM has been viewed or shared, deterrence is aimed at preventing the initial viewing or sharing from ever occurring. This limits the spread of abusive content and often encourages individuals seeking such content to get help, which can prevent future harmful behavior.

Thanks to a generous grant from Google, Prostasia has developed a set of search ads that can appear when someone searches for a known CSAM-related term. When clicked, these ads direct would-be CSAM viewers to our Get Help page, which contains instructions on reporting illegal content and lists resources that can help them stop seeking out harmful content. At the time of writing, we’ve redirected over 200,000 people away from searches for harmful and illegal content.

In late 2022, we added a new series of ads aimed at helping people who encounter illegal content by accident report it to law enforcement. These ads appear when someone enters search terms indicating that they want to report CSAM. We’re still in the early stages of evaluating this secondary approach, but we hope it will further reduce the spread of CSAM online by increasing the rate at which reports are filed.

In addition to running advertisements, we also work with content platforms and third-party developers to build effective deterrence strategies for their services. These can include popups that display when users search for known CSAM-related terms, auto-disabling of features that could be used in an abusive manner on content involving minors, and filters that prevent potentially abusive content from being publicly available until it can be reviewed by human moderators.
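To make the first of these strategies concrete, the sketch below shows roughly how a platform’s search handler might check queries against a deterrence term list and show a help message instead of results. It is an illustration only, not code we ship or recommend verbatim: the term list is left empty, and the helper functions and URLs are assumptions for the sake of the example.

```typescript
// Hypothetical sketch (not Prostasia's actual tooling): intercept searches for
// known CSAM-related terms and show a "get help" message instead of results.

// In practice this list would come from a vetted, regularly updated source;
// entries are deliberately omitted here.
const DETERRENCE_TERMS = new Set<string>();

function normalize(query: string): string {
  return query.trim().toLowerCase().replace(/\s+/g, " ");
}

interface SearchOutcome {
  blocked: boolean;
  results: string[];
}

// Placeholder search backend for the sketch.
async function runSearch(query: string): Promise<string[]> {
  return [`result for "${query}"`];
}

// Placeholder UI hook: a real platform would render an interstitial here.
function showGetHelpInterstitial(helpUrl: string, reportUrl: string): void {
  console.log(`Need help? ${helpUrl} | Report illegal content: ${reportUrl}`);
}

async function handleSearch(query: string): Promise<SearchOutcome> {
  if (DETERRENCE_TERMS.has(normalize(query))) {
    // Do not run the search; point the user to help and reporting resources.
    showGetHelpInterstitial(
      "https://prostasia.org/get-help/", // assumed URL for our Get Help page
      "https://report.cybertip.org/",    // NCMEC CyberTipline
    );
    return { blocked: true, results: [] };
  }
  return { blocked: false, results: await runSearch(query) };
}
```

The key design choice is that a matching query never reaches the search backend at all; the user sees help and reporting resources instead of results.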

Of course, deterrence alone will not solve the problem of child sexual abuse material. We continue to work with law enforcement to promote the removal of illegal content through methods that do not infringe on human rights. We are also involved in the development of technological solutions, such as tools to identify and flag suspected CSAM images for review by human moderators.
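As a rough illustration of how such flagging tools commonly work, the sketch below matches uploads against a list of known hashes and holds matches for human review rather than publishing them. It is a simplified outline under stated assumptions: the hash list and review queue are placeholders, and production systems generally rely on perceptual hashes (such as PhotoDNA) so that near-duplicates also match, rather than the exact cryptographic hash shown here.

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch: hold uploads whose hash matches a vetted list of known
// material for human review instead of publishing them immediately.

const KNOWN_HASHES = new Set<string>(); // populated from a vetted hash list

const reviewQueue: { uploadId: string; reason: string }[] = [];

function sha256Hex(data: Buffer): string {
  return createHash("sha256").update(data).digest("hex");
}

/** Returns true if the upload may be published immediately. */
function screenUpload(uploadId: string, data: Buffer): boolean {
  if (KNOWN_HASHES.has(sha256Hex(data))) {
    // Matched known material: withhold from publication and escalate.
    reviewQueue.push({ uploadId, reason: "hash match against known-material list" });
    return false;
  }
  return true;
}
```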

If you have encountered illegal content involving minors online, you can learn how to report it here.

Donate to support this work

Help others get help

Eliminating child abuse images without relying on police and censorship

