Prostasia Foundation
Protecting children by upholding the rights and freedoms of all
The future of Europe’s fight against CSA

Like much of how the Internet is governed, the way we detect and remove child abuse material online began as an ad hoc set of private practices. In 1996, an early online child protection society posted to a Usenet newsgroup devoted to such "erotica" (yes, such a thing really existed) to try to discourage people from posting there, on the assumption that the Internet couldn't be censored. But that was never quite true: eventually, Internet providers decided that hosting such a newsgroup on their servers wasn't such a good idea, and they blacklisted it along with similar newsgroups. But by then, the World Wide Web had overtaken Usenet in popularity, and the game began again.

Very quickly, governments woke up and decided that the removal of images of child abuse and child nudity from the Internet was something they needed to be involved in. In the United States, lawmakers looked to an existing government-linked nonprofit, the National Center for Missing and Exploited Children (NCMEC). NCMEC had been formed in 1984 in the early stages of what became an unprecedented media blitz on the issue of child abduction and sex trafficking, fueled by notorious cases such as that of Adam Walsh, who also gave his name to the national sex offense registry law.

In 1998, in the middle of the Internet's explosive early growth phase, NCMEC created the CyberTipline as a place for Internet users to report incidents of suspected child sexual exploitation. Over the next decade, NCMEC (in partnership with other U.S. federal agencies such as the FBI and ICE) established a network of alliances with foreign law enforcement agencies, and began forwarding them any CyberTipline reports that seemed to involve their countries.

In 2008, then-Senator Joe Biden sponsored the PROTECT Our Children Act, which made reporting suspected child pornography to the CyberTipline compulsory for Internet platforms. However, there was no mandate for them to proactively scan their servers for such material; they simply reported it if and when they became aware of it.

Hash scanning to the rescue

This reliance on user-reporting quickly became a bottleneck. Even after platforms had removed illegal material from their servers, it would reappear—and they would have to wait for another user report to find it again. Very soon, large platforms began digitally “fingerprinting” unlawful images when reporting them the first time, so that they could be automatically removed if they reappeared. But in the cat and mouse game between platforms and abusers, the latter quickly learned that making minor changes to an image could defeat such automatic scanning.
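The brittleness of exact fingerprinting is easy to demonstrate. A cryptographic hash (SHA-256 in this illustrative sketch; actual services have used various hash functions) changes completely when even a single byte of a file changes, so a simple match against a list of known hashes no longer fires:

```python
import hashlib

# Two "image files" differing by a single byte (e.g., one pixel tweaked).
# These are stand-in bytes, not a real image format.
original = bytes([10, 20, 30, 40, 50])
altered  = bytes([10, 20, 31, 40, 50])

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(altered).hexdigest()

# An exact cryptographic fingerprint changes completely after a
# one-byte edit, so an exact hash-match filter no longer detects it.
print(h1 == h2)  # False
```

This avalanche property is desirable for cryptography, but it is exactly what made minor image edits an effective evasion tactic.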

A breakthrough came in 2009, when Microsoft Research and Hany Farid, a professor at Dartmouth College, developed a new tool called PhotoDNA, intended to prevent known unlawful images from being uploaded even if small changes had been made to them. Internet platforms began sharing PhotoDNA-compatible fingerprints (or hashes) of images that were reported to NCMEC, until eventually NCMEC took over this function as well, becoming the maintainer of a shared database of hashes of unlawful images in 2014.

What does any of this have to do with Europe? Well, nothing much—and that has become a problem for Europe. Although some platforms that operated in Europe participated in this NCMEC-centered scheme, the unvetted reports that NCMEC was forwarding back to European authorities were of an extremely low quality, with up to 90% of them representing innocent, lawful content such as cartoons and family photos. (NCMEC has disputed this, claiming that its reports are about 63% accurate—still hardly an inspiring figure.) Responsibility for sifting the wheat from the chaff fell to European law enforcement authorities or abuse reporting hotlines. These hotlines organized in 1999 into a European Commission funded network, INHOPE, which NCMEC also later joined.

One of the key national European reporting hotlines was Britain's Internet Watch Foundation (IWF), which had originally been formed to tackle the problem on Usenet, and had begun collecting image hashes in 2015. Unlike NCMEC, which was a creation of statute, the IWF had been formed by Internet platforms themselves, many of which used its tightly-curated hash lists in preference to NCMEC's. In 2019, NCMEC and the IWF began sharing their hash databases with each other.

Rise of the machines

Despite its evolution, this regime for filtering unlawful sexual images of minors remained an ad hoc and largely private arrangement, but governments wanted more control. In November 2018, at the height of a campaign against "Wild West" Internet platforms led by UK tabloids and the government-linked child protection organization the NSPCC, the UK Home Office co-sponsored a tech industry hackathon aimed at developing artificial intelligence (AI) based surveillance tools for the detection of child grooming, which could be "licensed for free to resource-constrained smaller and medium-sized tech companies." Prostasia Foundation representatives attended the formal parts of the event, but were excluded from the hackathon itself. The eventual custodian of the completed tool, Thorn, also refused to license it to us for use in the peer support discussion group that we host.

Meanwhile, other AI surveillance tools were under development with the aim of responding to government demands that not only existing images of child sexual abuse, but also never-before-seen images, could be automatically detected and eliminated from Internet platforms. During 2018 both Google and Facebook began using proprietary tools that purported to be able to identify unlawful images of children. Despite concerns about their accuracy and about privacy implications around the use of such tools, as well as a lack of transparency and accountability around their operation, these experimental tools were quietly moved into production. Google also licensed its tool to other platforms (though again, refused to license it to Prostasia).

The ePrivacy Directive reality check

This extension of private surveillance in the name of child protection came to a screeching halt in December 2020, when the scope of Europe's ePrivacy Directive was extended to cover messaging platforms. As it turns out, the mass and indiscriminate use of surveillance tools by Internet platforms against their users, in the absence of any prior suspicion that they have done anything wrong, infringes their fundamental right to privacy. This placed not only the use of experimental AI surveillance tools, but even the much more accurate and well-tested PhotoDNA scanning, under a legal cloud.

The groundwork for a long-term solution to this dilemma had already been laid, in the form of a strategy for a more effective fight against child sexual abuse that the European Commission released in July 2020. As part of this strategy, the Commission planned to establish a new regime for the reporting and removal of unlawful sexual images of minors by Internet platforms, which would build in the privacy protections and democratic safeguards that the ad hoc private regime lacked.

Anticipating that this would not be ready in time for the commencement of the ePrivacy Directive, in September 2020 the European Commission proposed a temporary derogation from the ePrivacy Directive that would allow the continuation of scanning for child sexual abuse online until 2025 at the latest. Among the few safeguards included were that this derogation would be “limited to technologies regularly used” for this purpose, and that such technologies should “limit the error rate of false positives to the maximum extent possible.”

Although tech companies agreed to this proposal, the European Parliament's Civil Liberties, Justice and Home Affairs (LIBE) committee found its safeguards to be insufficient. The committee proposed a compromise with somewhat stronger safeguards, which would still allow the AI surveillance tools to be used to scan both private messages and photos, provided that the results were reviewed by a human being before being forwarded to law enforcement authorities.

As this compromise was unacceptable to the Council and the Commission, the ePrivacy Directive took effect with no temporary derogation in place. Facebook immediately ceased scanning for unlawful images in its messaging services for European users, although other tech platforms, including Google and Microsoft, continued scanning in the hope that the temporary derogation would still be agreed shortly.

The future

In February 2021, while negotiations over the temporary derogation continued, the European Commission opened a consultation on its long-term strategy. One of the main purposes of the consultation is to gather feedback on plans to establish a new legal regime under which Internet platforms would be required (or, perhaps, voluntarily encouraged) to detect known child sexual abuse material (and perhaps also previously unknown material and suspected grooming) and to report it to public authorities.

Although notionally independent of the temporary derogation negotiations, the reality is that there will be enormous pressure for whatever is agreed as a temporary measure to be grandfathered into the final legislative scheme. As things stand, two groups in the European Parliament are all that stand in the way of legalizing the scanning of private chats, emails and photos using unproven artificial intelligence algorithms. It should be remembered that these AI tools have been in use for less than three years, and were adopted only under enormous political pressure for tech companies to "solve" the problem of child sexual abuse, a task which, even with the best of intentions, they are simply incapable of performing.

Allowing this to happen would be an exercise in child protection theater, and a disaster for civil liberties. Yes, it is important to establish a new legal regime that accommodates the voluntary scanning of uploaded content for known unlawful sexual images of minors, using well-tested tools such as PhotoDNA. But this should not be taken as an opportunity to also legalize the use of intrusive and untested artificial intelligence algorithms that provide no demonstrated benefit to child safety.


In our single-minded focus on surveillance and censorship as solutions to the problem of child sexual abuse, we have finally hit a wall: the fundamental human rights that protect us all. If we really mean to protect children from abuse, pushing against that wall isn’t the answer. Instead, we need to broaden our approach, and consider how investing in prevention could hold the answer to a longer term, sustainable reduction in child sexual abuse, online and offline.

Thankfully the European Commission, with advice from experts in the field, has broadened the scope of its ongoing consultation beyond the establishment of a legal regime for the reporting and removal of abuse images, to include the possible creation of a European centre to prevent and counter child sexual abuse, which would provide holistic support to Member States in this fight. This centre could help support research into what motivates individuals to offend, evaluate the effectiveness of prevention programs, and promote communication and the exchange of best practices between practitioners. Having advocated for such an approach since our formation, we will be expressing our support for it in our response to the consultation.

Prostasia Foundation will also be holding a free webinar on March 15 with Member of the European Parliament Dr Patrick Breyer, and clinical psychologist Crystal Mundy, to discuss all angles of the future of the fight against child sexual abuse in Europe, and to provide participants with the background information they need to provide fully informed and comprehensive responses to the Commission’s consultation.

Register now
Introducing our new staff

Prostasia's staff more than doubled in size this month, as six new members joined our rapidly growing team. 

Rebecca Reeves, our new Development Officer, has a background in fundraising, grant writing, and community outreach. She is a graduate of Northeastern University, where she received her Master of Science in Security and Resilience with a focus in Counterterrorism. She is passionate about human rights, having spent time working with anti-human trafficking nonprofits supporting survivors and their rehabilitation.

Erin Gould, our new Video Producer, was born in New York, grew up in Saudi Arabia, and has lived and traveled all over the world. He currently resides in Alameda, CA. He is the head of Minimum Wage Entertainment, a production company that has created content for clients such as TheatreWorks Silicon Valley, Benefit Cosmetics, Town Hall Theatre, n5MD Records, KSD Casting, 42nd Street Moon, Bay Area Musicals, and several others. MW Entertainment is also responsible for the critically acclaimed web series CASTERS, now available in a remastered format on YouTube.

Kevin Ayram, our first Consulting Psychologist, is a psychologist who graduated from the Universidad de Buenos Aires. He is currently studying for a postgraduate degree in cognitive-behavioral treatments for people with depression and anxiety disorders, and is also training in clinical sexology. Kevin says:

As a society, we all want child sexual abuse to stop, but I believe that most decisions made towards this goal are not effective enough due to lack of consideration of scientific research by policymakers. In this way, what excites me the most is spreading a message to society about the importance of implementing policies based on scientific evidence to achieve the objective that we as a society so desire.

Xaverine Ndikumagenge is a team leader with expertise in campaign and project management, with 15 years of experience managing diverse projects in research, advocacy, and campaigning on the African continent. In addition to her work with Prostasia Foundation she is the African Region Director for the Global Alliance for Legal Aid (GALA), a non-profit public interest advocacy organization working on issues such as over-indebtedness and the incarceration of individuals in default. Previously, Xaverine was Regional Networker and Team Leader for the Africa Hub of Consumers International, a global voice for consumer justice and protection with 220 members in 100 countries, 54 of which are based in Africa. Xaverine has an MBA from the Management College of South Africa and a Bachelor of Science in Agriculture from the University of Illinois.

Cierra Zimmerman is our new Social Media intern. Cierra is an advocate for human rights and feels passionately about helping the LGBTQ+ and SW communities. She has a history of working with children and helping them grow, having previously been a teacher's assistant at a dance studio. You'll see Cierra's work on our new TikTok account, which we are developing as our latest social media platform.

Finally (for now!), Steph ElHaddad will be our new Grants Officer, for a new microgrants program that we'll be announcing shortly. Steph is a Lebanon-based trans and political activist who focuses his energy on issues related to the body, gender, and sexual expression. Steph's work extends to the South-West Asian and North African region, and he is interested in engaging with queer groups, activists, and knowledge production spaces in the global South. Steph has had several opportunities to work on topics of institutional patriarchy, sexism, and racism; political debates; and crime analysis, particularly gender-based crime. Currently, Steph is undertaking postgraduate studies in Criminology at Birkbeck, University of London.

And that's not all... we're continuing to grow, and will be opening applications for our first Activist position soon. Prostasia Foundation has never been stronger, and it's all thanks to the support of our donors, members, and philanthropic partners. Thank you!

Recent blog posts
Targeting child abuse image laws to protect children
Effective laws against child abuse images are critically important in deterring the abuse itself and reducing the ongoing trauma suffered by its victims. In order to serve these objectives, legislatures…
What I learned as a MAP's partner
There’s a lot about minor attraction that you don’t know. I (who wish to remain anonymous) feel pretty confident about that because, until recently, there was a lot about minor…
Review: Girl Lost: A Hollywood Story

Reviewed by Meagan Ingerman

I dunno guys. I. Just. Don’t. Know. 

So, right, Girl Lost: A Hollywood Story is a movie I watched so you don't have to. This whole thing may be pretty spoiler-laden, but that is partially because the material is ultimately formulaic.

In this movie, women are predators or victims, and in most cases both. Men are gross, but kinda left blameless because they're just paying for an experience. It's not their fault the experience is being sold. There are things about this cycle of abuse that ring true. It is not unheard of for a person who has been trafficked by someone they fear to end up helping that person traffic new people. I'm not sure how much else I'm willing to suspend disbelief for. Perhaps the most human thing about the movie is that no one gets to be the savior and no one wins.

There is a fine line with this movie. If you think of it as just entertainment, there is a lot of abuse to work around. If you think of it just as an indictment of trafficking, you miss the nuance of what sex trafficking, especially of minors, really looks like. 

My first and enduring impression of the movie is that it does not feel like sex workers or sexual assault victims were consulted in its making. There is very little information to be found about director Robin Bain, but her IMDb bio includes this: "The film is part of Bain's continued effort to bring awareness to the pitfalls of the sex industry in Los Angeles."

So, if that snippet is to be believed, then we’re off on the wrong foot already because sex work is immediately painted as bad. The movie, in large part, is about how doing anything in the sex industry is bad and you will end up unhappy. A main point of the movie should be that a possibly underage teen is being used for sex and to make money that is then kept from her. Instead, aside from a few snide/joking remarks about “pervs” and “pedophiles” being interested in her, for the most part, no big deal is made about the fact that minors can’t consent and all the footage of her being used for sex amounts to possessing and distributing child porn. 

Another thing to be dealt with here, with the issue of child sexual abuse and trafficking, is that the stats don’t really support the idea that it is little white girls from the suburbs with dismissive parents who are being trafficked. Putting a conventionally attractive white girl on screen garners sympathy for that character with very little effort. It also paints trafficking as something that is easy to do and commonplace.

BIPOC characters are not treated so nicely or with, really, any sympathy, with it being a given that the one Black character lives in poverty, is a single mom, and gets told her child needs a father. The women who are used for sex and entertainment in this movie are all used in more or less the same way. The relatively homogenized feel of the abuse is honestly kind of weird. But it doesn’t seem to stop racial stereotyping. 

It's clear that the movie is leading you to believe in its version of the sex industry, and it's clear that you are supposed to walk away thinking that the sex industry is awful and everyone doing sex work needs to be saved but probably can't be. Because it seems like everyone ends up stripping and being really sad about it. Drug and alcohol use are rampant, painting the scene as sex, drugs, and money.

A weird sort of pivot in the movie is that the woman who traffics a teenage girl she used to babysit also has some steamy kissing and touching scenes with her. I can't help but think that if the main baddies in this movie were male, there would have been a lot of backlash. As it is, there is some weird overlay of a "feminist" odyssey in parts.

The premise is shaky, and the acting is a little overwrought in some areas and a little dead in others; weirdly, there is still entertainment appeal. But overall I have to give the movie low marks for lazily resting on a narrative that does not reflect the truth of the material.

Mildly odd footnote: Dominique Swain, of Lolita fame, is given a starring credit but actually appears just briefly at the beginning. 

Upgrade your membership
Our strength comes from your support. Join as a member or upgrade today to become eligible for exclusive gifts!

Prostasia Foundation
18 Bartol Street #995, San Francisco, CA 94133
EIN 82-4969920