Is censoring online porn the best way to keep children safe?

If we want to make sure children are safe online, let’s stop them watching porn.

Does that sound like a sensible proposal? Or even a realistic one? While, on reflection, such a proposal seems a little strange, this has formed the basis of public discourse and policy discussion around “child online safety” for a considerable time.

Children are upset by cyberbullying. Well, let’s stop cyberbullying then! Even though defining cyberbullying in either a legal or algorithmic way is almost impossible because context is important.

Children see upsetting content related to suicide and self harm. We’ll call on the social media providers to remove this content to stop them seeing it. Even though that content remains available across the Internet.

Children are being exposed to pornography and we’re not sure what the harmful effects are. Let’s call on service and pornography providers to make sure children can’t access this content. Even though age verification is a flaky technology at best and children know all the work-arounds.

It’s easy to agree on the outcome we all want: that children should be safe. But splashy newspaper headlines and political slogans conceal a great deal of complexity about how best to realize this objective.

How the United Kingdom develops policy on child online safety

Here in the UK we are experiencing interesting times around “Online Safety”. At the time of writing we are awaiting the publication of the Government’s Internet Safety Strategy white paper, which will set out their policy position and legislative plan for making the UK “the safest place in the world to be online”, as pledged in the 2017 Conservative Party Manifesto.

I spend a lot of my life talking to children and young people about how and why they go online, their hopes and fears in the online space, and the risks they take. Given that “online safety” has become the de facto term used in schools for education around digital literacy, I often ask what they themselves make of the term; I can remember one 10-year-old boy once asking me, “What do you mean by safe anyway?”. These children are very conversant with “online safety” messages and can repeat them parrot fashion, for example:

What goes online stays online.

Think before you post.

It’s illegal to be on Instagram until you’re 13.

Yet the more fundamental principles about what it means to be safe seem to have evaded their learning. As an aside, I often challenge teaching staff on the claim that “it’s illegal to be on social media until you’re 13”, only to be met with arguments around child safeguarding and development, showing scant knowledge of the real reason for the age limit: the US Children’s Online Privacy Protection Act.

Returning to the point in hand, the concept of safety is rife in UK policy, yet it is arguably misused. A dictionary definition of “safe”, in the relevant sense, is “free from harm or risk: unhurt”. Are we really hoping to ensure that children and young people are free from harm and risk whenever they go online? That seems a fairly ambitious and unrealistic proposal.

Regardless, following the outcome of the 2017 general election, the Government moved forward with their manifesto commitment by publishing an Internet Safety Strategy green paper, which set out priorities and called for consultation with stakeholders on the intended policy direction, the end result being the white paper for which we are still waiting.

The four main priorities established in the green paper were:

  • setting out the responsibilities of companies to their users;
  • encouraging better technological solutions and their widespread use;
  • supporting children, parents and carers to improve online safety;
  • directly tackling a range of online harms.

These were underpinned by three principles:

  • what is unacceptable offline should be unacceptable online;
  • all users should be empowered to manage online risks and stay safe;
  • technology companies have a responsibility to their users.

Reflecting on these priorities and principles, we can see a focus on industry responsibility and technological intervention. The green paper went to great lengths to highlight the technical nature of the online environment and the need for industry to “take responsibility” for what goes on across their platforms, with the lion’s share of the paper focussing on industry as the primary stakeholder for “safety”.

What happens when children access pornography?

In terms of access to harmful content, one of the most interesting things to note about the green paper is its almost total lack of comment on young people’s access to pornography. In the UK there has been an almost pathological obsession with preventing children from accessing pornography since a headline in a popular tabloid news outlet in 2010 claimed that a third of ten-year-olds had viewed pornography. Arguably, this single headline has driven a policy position that persists to this day: we have to stop children accessing pornography.

I can, on the one hand, understand the political position – it is far easier for a politician to make a name for themselves with soundbites such as “We have to stop children looking at pornography”. A far less punchy quotation, and a far riskier policy position in terms of winning over tabloid journalists, would be “Children are accessing pornography and we know, technically, this is very difficult to prevent. Therefore we need to develop an education strategy that allows discussion and critical thinking about what is being viewed when someone accesses pornography”.

There also seems to be a black-and-white perspective on this policy: you either support it, or you want children to see pornography. Those of us with some level of knowledge about how the Internet actually works are not challenging this view to be difficult; we are challenging it because we know it won’t work! Filtering has been in place in UK schools for quite a while and, as a result of policy pressure, all Internet Service Providers now offer similar tools for the home environment. However, take-up in homes is poor, because the tools tend to be cumbersome and keyword and URL matching remains the leading edge of the technology, meaning that a sexual keyword, regardless of context, will result in a page or site being blocked. This has led to many sites containing highly useful information around sex education, sexuality and gender, reproduction, sexual health and the like being blocked.
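To see why context-blind filtering overblocks in this way, consider a minimal sketch of a keyword-based filter in Python. This is an illustration only: the blocklist and the sample pages are invented, not drawn from any real filtering product.

    # A minimal sketch of naive keyword-based filtering: block a page if
    # any listed keyword appears, ignoring context entirely. The blocklist
    # and sample pages below are invented for illustration.
    BLOCKED_KEYWORDS = {"sex", "porn", "breast"}

    def is_blocked(page_text: str) -> bool:
        """Return True if any blocked keyword appears in the page text."""
        words = set(page_text.lower().split())
        return bool(words & BLOCKED_KEYWORDS)

    # Health and education pages are blocked just as readily as pornography:
    print(is_blocked("NHS guide to breast cancer screening"))       # True
    print(is_blocked("Relationships and sex education resources"))  # True
    print(is_blocked("A recipe for roast chicken"))                 # False

The first two pages carry exactly the kind of legitimate health and educational content described above, yet a filter that never looks at context cannot tell them apart from pornography.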

I should stress that I am not myself massively liberal about youth access to pornography. While there is a dearth of research on the impact of pornography on child development, I have had enough conversations with teenagers to know it can be harmful. For example, a discussion with a 14-year-old boy who was experiencing both size and performance anxiety despite never having had sex highlights the potential impact. Moreover, many girls raise concerns about the “expectations” of partners who have been exposed to pornography. Clearly there are concerns, risks and harms that will affect some young people, and I would prefer it if they were not accessing it.

However, what I am clear about is that technical interventions will not prevent access, and will also have a harmful impact on children’s rights. Regardless of these concerns, voiced by many, the UK government presses ahead with ever stronger legislation around technical intervention to prevent young people from accessing pornography. It is, perhaps, for this reason that the green paper contains no mention of further discussion of how we might tackle young people’s access to pornography: the Government believes that, with legislation in place, this particular problem has been resolved.

Age verification and the Digital Economy Act

In 2017, the Digital Economy Act reached the statute books, and one part of it defined some frankly bizarre measures aimed at creating a porn-free utopia for young people. Part 3 of the Act requires any pornography provider operating in the UK on a commercial basis to put age verification in place to “prevent” minors from accessing their services and content. A regulator has been appointed to ensure this takes place, and organizations are threatened with heavy fines for non-compliance. The legislation and the surrounding political discourse make it clear that it is down to industry to find the “solution” that complies with the legislation. In order to keep children and young people “safe” from pornography, we need only ensure that they do not see it, rather than supporting young people to think critically about pornography and sexual content online.

Even with a watertight age verification solution (which this will not be), this would only prevent children and young people from accessing pornography on commercial sites; it would do nothing about, for example, the huge volumes of pornography that can be viewed via social media and blogging sites or shared using peer-to-peer techniques. While a detailed exploration of all the technical flaws in this legislation would make this a very long article, I shall put forward a simple one, which has been expressed to me many times by young people: “What about proxying?”

Simply put, in order to apply age verification to UK users, the server needs to recognize that a request has come from the UK, which it infers from the requesting IP address. With simple tools on their device, an end user can “proxy” their traffic so that the server thinks the request is coming from somewhere else entirely. If 14-year-olds can see the flaws in the legislation, it is worrying that the lawmakers cannot!
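As a rough sketch of how fragile this gating is, consider the following Python. The lookup table stands in for a real GeoIP database of the kind commercial services use to map addresses to countries, and the IP addresses come from reserved documentation ranges, so everything here is illustrative.

    # A minimal sketch of IP-based gating for age verification. The table
    # is a stand-in for a real GeoIP database; the addresses come from
    # reserved documentation ranges and are illustrative only.
    GEOIP = {
        "203.0.113.5": "GB",   # the UK user's real address
        "198.51.100.7": "NL",  # the exit address of a proxy or VPN abroad
    }

    def country_of(ip_address: str) -> str:
        """Hypothetical GeoIP lookup: map an IP address to a country code."""
        return GEOIP.get(ip_address, "??")

    def requires_age_verification(ip_address: str) -> bool:
        """The server only gates requests that appear to come from the UK."""
        return country_of(ip_address) == "GB"

    # Connecting directly, the UK user is gated:
    print(requires_age_verification("203.0.113.5"))   # True
    # Routed through the proxy, the same user appears Dutch and is not:
    print(requires_age_verification("198.51.100.7"))  # False

Nothing about the user changes; only the apparent origin of the request does, which is precisely the work-around young people describe.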

When I talk to children about the “solutions” to the pornography problem, they are very clear: it has to come from education. They want better education, delivered by knowledgeable and non-judgmental practitioners, and more opportunities to talk about and ask questions about the complexities of the digital world without the risk of being “told off”, getting “into trouble” or the “threat of prosecution”. What they are given instead are technical interventions that we know will not work and that will have a negative impact upon their rights. Article 3 of the UN Convention on the Rights of the Child clearly states that the best interests of the child are paramount, yet we continue to approach online safeguarding with a dystopian view that will not achieve its aims and will erode the rights of the child in the process.

Informed, progressive, critical relationships and sex education, delivered by staff who are trained in the field, is what children are asking for, whether in relation to pornography or any other aspect of “online safety”. Why is this so difficult to achieve, and why do legislators believe that they know best? I would suggest it is because we have lost the youth voice in this debate. Children and young people are having policy and legislation placed upon them because we “know best” and we have to keep them safe. We cannot keep children safe online; we can, however, give them the knowledge to understand the risks they might face and how to mitigate them. We will not do that with technical intervention; we will only do it with education and informed discussion.
