In 2022, the Prostasia Foundation took a step back from public-facing activism. Doing so allowed us to regroup after our unprecedented success in starting a nationwide conversation about child protection. The move was also a response to the mainstreaming of QAnon-esque misinformation about abuse prevention and the anti-LGBTQ “groomer” smear, which fueled harassment campaigns against our staff and other experts in the field.
From an outside perspective, you may have seen this change reflected in the slower publishing rate of both our blog and newsletter. Projects like The Prostasia Conversations were put on hold, and other human rights organizations took the lead in fighting new censorship bills like the EARN IT Act and KOSA. We also stopped organizing public events during the year, participating in only one.
Behind the scenes, this time was an opportunity for us to look toward the future. We laid the groundwork for MAP Support Club to undergo a professional evaluation and took the first steps in developing a training program for its staff. Our own team welcomed new members and renewed our focus on emerging child safety issues. We also reviewed our ongoing projects to identify growth opportunities and ensure they reflected our unwavering anti-abuse stance.
Among the projects that underwent this review was our CSAM Deterrence Project, the centerpiece of our efforts to fight the spread of child sexual abuse material online. The initiative uses advertisements in search engine results to stop people from seeking out harmful content and redirect them to professional support. As of February 2023, over 217,000 people have clicked our ads.
While the project looked effective on paper, our review identified several opportunities for improvement. Most notably, the targeting of our ads was too broad, and they were sending users to outdated pages with limited resources. Fortunately, we were already building a new page with information on reporting CSAM and finding support to stop viewing illegal content, so we decided to direct individuals searching for CSAM there going forward.
Around the end of January 2022, we launched the updated ads, initially running them alongside our existing ads to allow comparison. Despite a somewhat underwhelming launch, performance improved over time. Clicks began increasing in March and spiked in April, and costs slowly decreased after that. Combined with the increased efficiency from better targeting, these changes improved the project’s overall effectiveness and flexibility. We gradually discontinued the old ads over the following months, starting with the least effective ones to minimize disruption.
Near the end of the year, we discovered that the new ads were appearing in searches related to reporting CSAM. We saw this as another opportunity to stop the spread of harmful content and began developing a second ad campaign to help these individuals find guidance. This second set of ads proved highly effective after launching in November 2022, receiving more clicks at a lower cost than the original campaign.
Today, we are running both new sets of ads, with plans to continue expanding their reach. We are also developing a CSAM scanning plugin for Discourse and continuing to offer consulting services to online platforms that want to implement abuse prevention initiatives. There’s a long path to eliminating CSAM from the internet, but we’re committed to the fight. Together, it’s one we can and will win.