Prostasia Foundation’s Hall of Fame highlights the efforts of organizations and individuals who have made a significant contribution towards protecting children from sexual abuse, while upholding human rights and sex positivity.
This year, our inaugural call for nominations came at a time when the world was adjusting to the new reality of children sheltering at home. The risk of mismanaging this new reality is high. Experts have expressed fears about children’s elevated risk of abuse within the home, as well as the dangers of additional unsupervised screen time—all while they are separated from the school environment, where signs of abuse can sometimes be detected early.
In this context, it is even more challenging than usual to address risks to children while maintaining a sex-positive orientation that centers the human rights of all. This year’s inductees have effectively risen to that challenge, and we honor them in this year’s inaugural edition of our Hall of Fame.
Inductees to the Hall of Fame are not ranked or rated, and new members will be added every year, joining those who are already there. A unique feature of our Hall of Fame is that we don’t simply valorize our inductees. We also address some of the criticisms that each of our nominees has received.
Our intention is that this Hall of Fame will amplify the work of some whose contribution towards the prevention of child sexual exploitation is often overlooked, because their approach may differ from that promoted by more prominent child safety stakeholders. A future in which child sexual abuse has been eliminated will require innovative, multi-sectoral approaches to its prevention. This year’s inductees have demonstrated courage and foresight in promoting their own diverse visions of this future.
Cloudflare

Cloudflare is a web infrastructure and security company, best known for its content delivery network that enables websites to load faster, while protecting them against attack. Cloudflare joins our Hall of Fame this year because of its introduction of a new, simple CSAM Scanning Tool enabling its members to automatically monitor their websites for known illegal sexual images of minors, and to report those images to the authorities.
Although other tech companies such as Microsoft and Facebook had provided access to components of a CSAM scanning solution before now, these were not drop-in solutions. They required significant technical knowledge to integrate them into a website’s systems, and we have found some errors in their documentation. Access to Microsoft’s tools and to the databases of image hashes that they utilize is also restricted, and can be refused on a somewhat arbitrary basis.
Cloudflare’s CSAM Scanning Tool solves these problems, by enabling any Cloudflare member website to turn on scanning with the click of a button. This simplicity does come at the cost of some control for the website administrator—there are only two hash databases available, and the administrator doesn’t have the choice to adjust the “fuzziness” of what counts as a match. However, Cloudflare has signaled that it intends to provide more granular control to website administrators in the future.
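For readers curious what a “fuzziness” setting means in practice: scanning tools of this kind compare perceptual hashes of images against a database of hashes of known abuse material, and declare a match when two hashes are within some distance of each other. The sketch below is purely illustrative and is not Cloudflare’s implementation: real scanners use proprietary hash algorithms such as PhotoDNA, and the 64-bit hashes, database values, and threshold here are all invented for demonstration.

```python
# Illustrative sketch only: the hash values and the default fuzziness
# threshold below are invented, not drawn from any real scanning tool.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

def is_match(image_hash: int, database: list[int], fuzziness: int = 10) -> bool:
    """Report a match if any known hash is within `fuzziness` bits.

    A fuzziness of 0 requires an exact hash match; larger values tolerate
    subtle alterations to an image, such as resizing or recompression,
    which change a perceptual hash only slightly.
    """
    return any(hamming_distance(image_hash, h) <= fuzziness for h in database)

# Example: a hash differing from a database entry in only its low two bits.
database = [0x0F0F0F0F0F0F0F0F]
altered = 0x0F0F0F0F0F0F0F0C

print(is_match(altered, database))     # matches at the default fuzziness
print(is_match(altered, database, 0))  # no match when an exact hash is required
```

The trade-off the article describes is visible here: a higher threshold catches more altered copies of known images, but also raises the chance of a false positive, which is one reason an operator might want control over it.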
Cloudflare has been criticized—including by one of our other inductees into this Hall of Fame, the BADASS Army—for requiring a high standard of proof that a website’s content is illegal before it will withdraw its services from that website. That’s because Cloudflare regards itself as an infrastructure provider akin to the telephone service, rather than a content platform such as Facebook. As such, there are only a couple of outlying cases, including the Nazi website the Daily Stormer and the extremist-friendly message board 8chan, in which Cloudflare has withdrawn service to a website on the basis of its content.
Cloudflare’s reasoning makes sense. Illegal content ought to be removed from the Internet at the source—but there must also be checks and balances to ensure that the removal is conducted in an accountable and transparent fashion. Given the potential for Internet infrastructure providers to be used as a chokepoint to enable extra-legal censorship of a broad range of content, there is good reason to reserve the removal of entire websites to legal authorities.
More isolated instances of illegal images appearing on websites should not be tolerated either, but our approach to their removal should be a more targeted one. Cloudflare’s CSAM Scanning Tool provides such a targeted approach to platforms of all sizes, and Prostasia Foundation recognizes this achievement by including Cloudflare in our inaugural Hall of Fame.
Association for the Treatment of Sexual Abusers (ATSA)
The Association for the Treatment of Sexual Abusers (ATSA) is a membership organization for professionals dedicated to the prevention of sexual abuse through the effective treatment and management of those who have offended sexually or who are at risk of doing so. ATSA joins our Hall of Fame this year as the peak international professional organization for the prevention of sexual abuse, including the sexual abuse of children.
ATSA holds an annual conference attended by researchers, clinicians, social workers, and representatives from the justice system. Prostasia Foundation attended this event last year to present on legal and psychological issues around sex dolls, a topic that other conferences have avoided due to the stigma surrounding it. But ATSA welcomes research on stigmatized topics such as the effects of pornography, the psychological and social factors behind child sexual abuse, and the management of juveniles who offend.
ATSA publishes a journal, two Practice Guidelines for treatment providers for adult and juvenile offenders respectively, and a Professional Code of Ethics. It also hosts a professional online forum and conducts training events.
Over time, ATSA’s choice of name has become a little outdated; “sexual abusers” is not person-centered language and suggests that someone who has perpetrated a sexual offense, even as a minor, must always carry that mantle even as they work towards rehabilitation and reintegration into society. But in practice, if not in name, this attitude is one that ATSA members have done much to dismantle, by shedding light on the complexity of the factors that can lead people from all walks of life to commit acts of abuse, and the real possibility of effective interventions to prevent them from doing so.
No organization is immune from harboring those who could perpetrate abusive acts, and ATSA itself is no exception. In the past year, ATSA members were saddened to learn that a member and past leader of the organization had been charged with a sexual offense. His membership was suspended pending the outcome of the charges. Although such revelations are damaging, ATSA members know better than most that there is no simple profile that marks out those who will offend, and that it is the rule, not the exception, that someone who offends will have been trusted by their family, friends, and colleagues up until the moment that their perpetration is uncovered.
We recognize ATSA in our Hall of Fame this year for its ongoing work to bring together the world’s leading experts to apply an inclusive and evidence-based approach to sexual abuse prevention. In a political and media environment in which this problem is frequently over-simplified, ATSA refuses to follow suit. Year after year, its members are building up a comprehensive program of social, legal, and personal interventions with the aim of eventually seeing the scourge of sexual abuse eliminated from society.
Internet Watch Foundation (IWF)
The Internet Watch Foundation is the United Kingdom’s Internet reporting hotline for unlawful sexual images of minors, maintaining databases of image hashes and URLs that its members can use to block abuse images, and the websites on which they are found. There are several aspects that make the IWF unique globally. Under statutory authority, it hires a full-time team of analysts who proactively search for abuse images to be added to its databases, rather than relying entirely on public reports. It also makes these databases available on subscription to private sector members from around the world.
In the past year, the IWF reached an agreement with NCMEC, the United States reporting hotline that also operates under a statutory scheme, to share their respective databases of hash values—essentially, digital fingerprints that can uniquely identify images of child sexual abuse, even if they have been subtly altered. By increasing the comprehensiveness of their joint data set, this agreement is expected to allow more abuse images to be automatically eliminated by subscribing Internet platforms.
The IWF has provided measured criticism of campaigns led by the British press and the government-sponsored child safety group NSPCC that would place rigid requirements upon Internet companies to proactively eliminate content that could be harmful to children. In its response to the UK government’s Online Harms White Paper, the IWF put forward the view that “[any] new regulatory framework must encourage companies to continue sharing their experiences and solutions.” The toning down of the government’s initial “tough on tech” rhetoric, to favor a more collaborative approach in its final White Paper, can be attributed in part to the IWF’s intervention.
In this regard, it hasn’t passed unnoticed that the IWF is predominantly funded by the Internet companies who are its members. However, it addressed this concern in a recent podcast episode, which outlined how much industry is already doing to solve the problem of unlawful sexual images of minors, and explained the trade-offs that make “doing more” a less simple proposition than it appears.
Prostasia Foundation has criticized INHOPE, the organization of reporting hotlines of which the IWF currently holds the chair, for its refusal to rule artistic images outside of its remit as an organization. We have made it very clear that the conflation of such artistic images with actual visual records of child sexual abuse is ethically unjustifiable and harmful. The IWF, indeed, does maintain a database of what UK law inaccurately describes as “non-photographic child sexual abuse images” that its members can choose to censor.
Even so, the IWF itself manages this tension in a thoughtful and deliberate way. Its register of non-photographic images is not intermixed with images of real abuse, and it does not monitor reports of such images that are hosted outside of the UK, or provide reports of such images to foreign authorities. These are both commitments that INHOPE has declined to adopt as best practices for its membership.
Compared with other hotlines, the IWF has shown greater dedication towards ensuring its own accountability and transparency, and has given a higher priority to upholding human rights values, such as through the appointment of human rights expert Andrew Puddephatt as its Independent Chair. For these reasons, we are pleased to include the IWF as an inductee into our inaugural Hall of Fame.
Family Online Safety Institute (FOSI)
The Family Online Safety Institute (FOSI) is an international membership organization for private sector companies to collaborate with government and nonprofit stakeholders on online safety and privacy issues. We are recognizing FOSI in our Hall of Fame this year for its courage in exposing the shortcomings of the politicized EARN IT Act, and encouraging policymakers to take a more evidence-based approach to child online safety in its various dimensions.
Its April 2020 webinar Protecting the Vulnerable Online: Why Encryption is Key was held one month after the EARN IT Act was introduced, and played a significant role in educating policymakers on the shortcomings of this bill. FOSI’s inclusion of those who would be harmed by restrictions on encryption set its webinar apart from similar events such as the March hearing on the EARN IT Act in the Senate Judiciary Committee. For example, Carlos Gutierrez from LGBT Tech spoke about how encryption helps to ensure the safety of LGBTQ+ communities, and Elaina Roberts from the National Network to End Domestic Violence (NNEDV) discussed the importance of safe and secure communications for those trying to get out of an abusive relationship.
This isn’t an aberration for FOSI, which has long championed a shared culture of responsibility for Internet safety, bucking the conventional wisdom that Internet companies hold the keys to keeping children safe, and that tough laws are needed to ensure that they execute this responsibility to the government’s standards.
Instead, FOSI encourages all segments of society, including parents and children themselves, to rise to the challenge of living safely in the online environment. This sometimes means broaching controversial ideas—such as that being less controlling of our children online, rather than more so, can help them to build the lifelong skills that they need to navigate the network safely. Although this is a view supported by experts, it is seldom heard from child safety groups that have an agenda of censorship and centralized control to advance.
The biggest structural weakness of FOSI as an organization is that its members are technology companies. This inevitably opens the door to criticism that its recommendations may be partial towards the interests of those companies. FOSI also has work to do on its diversity; its panels are often all white and predominantly male. In 2020, this clearly isn’t acceptable, and there is much room for FOSI to improve here.
Even so, we recognize FOSI in our Hall of Fame this year because the quality of its work and the nuance that it manages to bring to it—no mean feat in Washington, DC—shine through. It’s easy to oversimplify child safety and privacy issues, and to allow the loudest and most conservative voices to dominate conversations on these topics. FOSI resists this pressure, and the dialogues that it hosts are all the richer because of that.
The BADASS Army

The BADASS Army (Battling Against Demeaning & Abusive Selfie Sharing) is an advocacy and support group by and for survivors of image-based sexual abuse. We honor them in our Hall of Fame this year for their work to combat the non-consensual sharing of explicit personal images of adults, as well as the sharing of unlawful sexual images of minors.
They describe their multi-pronged approach towards doing this using the acronym LEET—legislation, education, empowerment, and tech. Legislation can ensure that image abusers are able to be prosecuted for their crimes. Education can help all involved stakeholders to recognize image abuse for what it is, and to limit the opportunities available to perpetrators. Empowerment for survivors includes providing them with the tools to protect themselves and to retake control over the violation that they have experienced. Technology tools can be used to prevent image-based abuse, and also to fight back.
Like many other victims, the BADASS Army have been harshly criticized—ironically, most often by people calling themselves feminists—for sexualizing some of their campaign graphics, messages, and merchandise. But the very point of their approach is that the sexuality of the survivor isn’t wrong or shameful. When someone shares intimate images of themselves with another, that is an act of trust that should be honored. When it isn’t honored, the fault and the shame lies only with the person who has betrayed their trust. Survivors of image-based abuse are entitled to reclaim their sexuality, and they should be allowed to do this on their own terms.
Unlike many groups that purport to fight against the unwanted sexualization of women, the BADASS Army doesn’t oversimplify the problem, and their message is not a proxy war on pornography or sex work. They are honest about their intentions, and they are transparent about their approach. While we may not agree with them on every point, the BADASS Army is an authentic, principled, and hard-working group of survivors who are taking their grassroots message to the halls of power. For that, the BADASS Army deserve our respect and a place in our inaugural Hall of Fame.
Disclaimer: Logos used on this page are for identification only and do not imply any endorsements or affiliation between the brands shown and Prostasia Foundation.