Mozilla, DNS-over-HTTPS, and child abuse

A closeup of a browser URL bar with a globe icon to the left, angled so that the text becomes blurry and eventually runs off the screen as the URL continues.

Last month, the UK Internet Services Providers’ Association nominated Mozilla as one of its 2019 Internet Villains, accusing it of undermining Internet safety standards that help to protect children. What is this all about? Well, it’s about a technology called DNS-over-HTTPS, which Mozilla is trialling as a way to make your Internet usage more private and secure.

Whenever you use the Internet, your computer is silently translating domain names such as prostasia.org into numbers that represent the address of a server. It does this using a public, decentralized database called the DNS (Domain Name System), often described as the “telephone book of the Internet.”
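
If you’re curious what this looks like in practice, here is a minimal sketch of our own in Python, using only the standard library; the domain name is just an example.

```python
# Ask whatever resolver the operating system is configured with (normally
# one supplied by your ISP) to translate a domain name into server addresses.
import socket

for family, _type, _proto, _canonname, sockaddr in socket.getaddrinfo("prostasia.org", 443):
    print(sockaddr[0])  # an IPv4 or IPv6 address of the web server
```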

Looking up addresses in this database is currently done in a way that isn’t very private or secure; for example, your Internet provider (ISP) can see what websites you are looking up, and can just as easily block you from looking up the address of a website, or even return a false address, much as a phishing attack does. Historically, these shortcomings of DNS have been used as a justification for forcing ISPs to block websites to protect children, and some countries have even mandated by law that ISPs must do this.

But ISP-level blocking was never wise or reliable, because it relies on known weaknesses in the DNS technology. A new enhancement to the DNS called DNS-over-HTTPS (or DoH) would fix these weaknesses. Originally proposed by Mozilla and further developed at the Internet Engineering Task Force (IETF), DoH is intended to guarantee stronger security and privacy for end-users, as well as more transparent choices about whom an end-user can trust to return reliable results to their computer’s DNS queries. In many cases DoH is also more efficient, resulting in websites loading faster for users.
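
To make the contrast concrete, here is a rough sketch of the same kind of lookup performed over DoH. It uses the JSON convenience interface that some public resolvers offer, rather than the binary wire format defined in the DoH standard, and the endpoint shown (Cloudflare’s public resolver) is simply our choice of example; any provider the user trusts could be substituted.

```python
import json
import urllib.request

# Carry the DNS question inside an ordinary HTTPS request to a DoH resolver.
url = "https://cloudflare-dns.com/dns-query?name=prostasia.org&type=A"
request = urllib.request.Request(url, headers={"Accept": "application/dns-json"})

with urllib.request.urlopen(request) as response:
    answer = json.load(response)

# Because the question and the answer travel inside encrypted HTTPS traffic,
# an ISP or other on-path observer cannot read or tamper with them.
for record in answer.get("Answer", []):
    print(record["name"], record["data"])
```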

But because it doesn’t work well with one of the current methods for ISP-level blocking of child abuse images, it has inspired a rich debate in the technical community. Concerns have also been raised, including in the United Kingdom’s House of Lords, about possible disruption of existing Internet filtering and blocking schemes in the UK that aim to protect children, the most vulnerable members of our society, from harm. But DoH would not interfere with parental control software, the technical tool favored by experts for protecting children; and if DoH were deployed more widely and made configurable in the browser, it could even enable greater competition between parental control providers.

Last week the Internet Watch Foundation (IWF) spoke up with its own concerns, arguing:

[T]he IWF provides a URL blocklist, which allows internet service providers to block internet users from accessing known child sexual abuse content until it is taken down by the host country. The deployment of the new encryption system in its proposed form could render this service obsolete, exposing millions of people to the worst imagery of children being sexually abused, and the victims of said abuse to countless sets of eyes.

But in a joint submission sent to the UK government last week, Prostasia Foundation and freedom of expression organization Article 19 reached a different conclusion. Our submission explains why:

We believe that if DoH became widely adopted in the UK, this would merely hasten the natural obsolescence of DNS-based blocking, which was never fit for purpose to begin with. Given the inadequacy of DNS blocking, the UK’s reliance on this technique has created a false sense of security. DoH exposes this inadequacy, but it has existed all along. Children deserve better.

To understand why this is so, we need to step back a bit and look into more of the technical details behind DNS-over-HTTPS and its predecessor.

The DNS problem, and the DNS-over-HTTPS solution

Today, DNS services are bundled with internet service provision. That is, the commercial entity responsible for looking up the virtual location of a server in the DNS database on behalf of the end-user is normally the internet service provider (ISP).

The ISP may be a residential ISP, a mobile operator, or a WiFi network at a café, railway station, airport, or public library. Effectively, the end-user has a different DNS service provider every time they change networks, and in practice non-expert end-users cannot verify who provides their DNS at any given time.

DoH opens the possibility for end-users to more easily choose their DNS provider, for instance a web company or an independent company. It makes it possible to choose a trusted provider and keep that provider across many different networks, in a way that is resistant to being overridden by the ISP.
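
As a concrete illustration, Firefox exposes this choice through its “Trusted Recursive Resolver” preferences in about:config. The listing below shows roughly how a user might point the browser at a chosen DoH provider; the preference names reflect Firefox at the time of writing, and the resolver URL is, as we understand it, Mozilla’s default trial partner, given here only as an example.

```
network.trr.mode = 2                                              # use DoH first, falling back to the ISP’s resolver if it fails
network.trr.uri  = https://mozilla.cloudflare-dns.com/dns-query   # the DoH provider the user has chosen to trust
```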

In addition, it limits the vulnerability of the user to surveillance of their Internet usage by encrypting their DNS lookup requests. Specifically, it makes it harder for third parties other than the DNS provider to discover the websites or other Internet services the user is accessing.

One consequence of DoH is that an end-user who chooses to trust a DNS provider other than their internet service provider (ISP) or mobile operator (MO) may end up with a DNS provider who is not covered by the internet filtering obligations that ISPs and MOs are subject to under UK law, such as blocking websites identified as containing images of child abuse. This, in a nutshell, is why Mozilla has been accused of abetting child abuse.

The reality, however, is that DNS-based blocking has always been a fragile technique for Internet filtering. Even without DoH, users can and do easily circumvent DNS blocks using methods like Virtual Private Networks (VPNs), the Tor browser, or simply by manually changing their computer’s network settings.

More robust methods for the elimination of illegal content exist:

  1. Take down, don’t block. Illegal content should be traced to its source, and those who spread it prosecuted. Every blocked URL or filtered connection is a URL or connection that does not lead to a police investigation of those who prey on the most vulnerable among us.
  2. If ISPs must block at all, they should use hash-based filtering, which targets the most egregious material that has already been identified as illegal by a relevant authority.
  3. More fine-grained, customizable filtering is possible using parental control software. Experts favor the use of such tools over centralized blocking by ISPs, which is both easy to circumvent and difficult to customize.

None of these measures would be affected by DoH. The consequence that the IWF is objecting to is the effect on its URL blocklist, which uses the obsolete blocking technique of haphazardly trying to block child abuse content by matching the website address at which it was last seen. But just last month, the IWF URL blocklist caused a major UK ISP to block the entire Imgur website, potentially affecting millions of users.

Blocking illegal content using an image hash, by contrast, is far less prone to such embarrassing errors. The IWF offers a separate database of image hashes that ISPs can use to prevent illegal images from entering their systems, and this won’t be affected by the introduction of DoH.
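
To illustrate the difference, here is a highly simplified sketch of the general idea behind hash-based filtering. Real deployments use vetted hash lists from bodies such as the IWF or NCMEC, and typically rely on perceptual hashes (such as Microsoft’s PhotoDNA) rather than the plain SHA-256 shown here, so that resized or re-encoded copies still match; the hash value and function name below are placeholders of our own.

```python
import hashlib

# Purely hypothetical entry; a real list would come from a vetted authority.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_illegal(image_bytes: bytes) -> bool:
    """Return True if this exact image's fingerprint appears on the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES
```

Because the match is against the image itself rather than the address where it was last seen, a hit does not take an entire website offline, and the same fingerprint keeps working wherever the image reappears.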

Technology alone will not prevent child abuse

But ultimately, we cannot rely on technology alone to solve the problem of exposure to illegal or unwanted sexual content online. Recent child protection efforts are turning away from Internet filtering and blocking as a sustainable solution for child welfare. Robust research and engagement with real children has demonstrated that trusting relationships between adults and children, combined with proactive communication strategies, work best to prepare children for the digital world.

The recent Parents’ Guide To Kids Online, published by the Swedish Prince Carl Philip and Princess Sofia’s Foundation together with BRIS (Children’s Rights in Society), is an example that we included in our joint submission. The guide presents itself in this way:

Kids and teens worry about things online they feel parents don’t have a clue about. They might also fear getting blamed for things that have happened, especially if they have done something they know their parents are concerned about. Parents, on the other hand, often feel they are lacking in knowledge about new apps or trends. This gap between kids and grown-ups is the biggest hurdle to conversation. And that’s what we want this book to change.

The goal of the book is to give parents tools to address difficult topics with their children on the children’s own terms. It does not contain any mentions of Internet filtering and blocking, because it turns out children do not want or need Internet filters. Rather, they need adults who care and understand.

In addition, more attention needs to be given to changing the behavior of adults who seek out unlawful material online: not by throwing technical roadblocks in their way (which will inevitably be circumvented), but by educating them about the impacts of their behavior, and providing them with the resources and support that they need to avoid offending.

Conclusion

We need to take the protection of children seriously. But that’s why it’s folly to rely on an outdated approach like DNS blocking, which isn’t a serious response at all.

It also falls into the trap of assuming that there is a foolproof technological solution to every problem. It’s long past time to break out of that mindset and to realize that protecting children from sexual harms has to be approached holistically, with active intervention along the way from parents, sex educators, and supportive peer groups—not by handing control over to machines.

There are some legitimate concerns about how to make DoH work better alongside the existing, less private and secure DNS. But the argument that we should reject this technological advance because it puts children in harm’s way is false. With DoH in place, everyone, children included, will have access to a more private and secure Internet. We consider this to be much more of an opportunity for child protection than a threat.

Notable Replies

  1. blii says:

    I generally prefer to disrupt things at the source (distribution and production) rather than playing the endless game of cat and mouse with viewers that modern politicians love so much, which sometimes leads to viewing an image carrying heavier consequences than a contact offense.

    There isn’t much information here, but this is kind of interesting: snuffing out every means of monetizing this content would be good, especially as it disincentivizes distribution. There might also be other ideas that go by the wayside.

    As ads run scripts on the page anyway, perhaps they could scan for specific hashes and, if one is found, terminate the ads and report the incident to law enforcement? They might also spot other warning signs and report suspicions.

    This might be complicated, as the majority of the content on a site may be safe, so there might be kinks to work out, and it could be co-opted for other purposes, like sniffing around to see whether someone has a pirated movie. Reporting to law enforcement is a good course of action regardless; the proportion of offending content and the timeliness of removal could be factors. If a site is particularly at risk but legitimate, then perhaps it could implement measures for expedited removal, to cooperate with the industry’s own efforts.

    Distribution of content hashes might be one option, but it could be risky if these were to get into the wrong hands, as someone pointed out: they could simply mutate an image, run it through the hash a thousand times, and see when it stops matching. This could happen completely automatically, without anyone’s involvement.

  2. It should, however, be possible to test that legitimate images are not included. Currently, we can’t have much confidence in that, because NCMEC is including known legal content in some of the hash list data that it supplies to companies, and the IWF was unable to confirm whether or not its lists include cartoon images that are legal in many countries. We are following up formally with both of them as part of our upcoming report on transparency in the child protection sector.

Continue the discussion at forum.prostasia.org
