Kik is a messaging app launched by the Canadian company Kik Interactive in 2010. It is free for users, but it is no basement operation; Kik Interactive has raised over $120 million in traditional funding rounds, on top of a $100 million initial coin offering in 2017. Its main difference from other messaging services such as WhatsApp is that it lets users create accounts and find friends through usernames, without linking an account to a phone number.
This means that, if you want to chat with someone without giving them your phone number, you could tell them your username on Kik. It also means that, if someone chats with you on there, you have no way of knowing what their phone number is.
This past year, Kik has made news at all levels because of its use in the sexual exploitation of children. Forbes reporters documented how they created fake profiles of teenage girls, joined public groups with those profiles, and received harassing messages from men. In the United Kingdom, the BBC reported that Kik figured in 1,147 child sexual exploitation investigations over the last five years. In the week this post was written alone, there was press coverage of thirteen separate instances of child predation involving Kik in the United Kingdom, the United States, and Canada.
This coverage alone would paint Kik as an app from hell, useful only to child molesters: the “predator paradise” a convicted child molester described to CBS News. That is not, however, the face of Kik I first encountered. Kik was instrumental to the operations of an organization I accompanied as a digital security trainer while all of their other communication channels were being monitored. Kik is helpful for women my age who do not want to give their numbers to strangers on dating apps, fearing they will be the next ones telling their friends stories of stalking and harassment. LGBTQI and kink-friendly communities benefit from the anonymity the app provides, talking about their interests without fear of having their offline identities linked to conversations that could later be used to shame them.
What gives, then? Do these adults have privileged communications at the expense of the safety of youth? I wholeheartedly believe that, as with most secure messaging technologies, the picture is more complicated.
Protecting children and protecting secure communication
Compared with other secure messaging applications, Kik protects messages with encryption only in transit, not end-to-end. This means it can shield you from privacy violations by those who monitor your internet connection (state and organized crime actors if you are a journalist or a dissident; school administrators if you are a teenager). But because the encryption is not end-to-end, Kik Interactive could still, in theory, read or copy your messages during the short period when its servers route them from one user to the other. This raises the question of whether the company hands information about dissidents to governments that request it, but also whether Kik conversations could be monitored to provide evidence in child abuse cases.
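To make that distinction concrete, here is a minimal sketch in Python, using the cryptography package, of why transit-only encryption lets the operator read messages while end-to-end encryption does not. It is illustrative only, and bears no relation to Kik's actual protocol or code.

```python
# Minimal sketch contrasting transit-only encryption with end-to-end
# encryption. Illustrative only -- this is not Kik's actual design.
from cryptography.fernet import Fernet

# --- Transit-only model ---
# Each user shares a channel key with the SERVER, so the server must
# decrypt a message before re-encrypting it for the recipient.
alice_to_server = Fernet(Fernet.generate_key())  # key held by Alice and the server
server_to_bob = Fernet(Fernet.generate_key())    # key held by the server and Bob

ciphertext = alice_to_server.encrypt(b"meet at noon")
plaintext_on_server = alice_to_server.decrypt(ciphertext)  # server sees plaintext here
forwarded = server_to_bob.encrypt(plaintext_on_server)
print(server_to_bob.decrypt(forwarded))  # b'meet at noon'

# --- End-to-end model (simplified) ---
# Alice and Bob share a key the server never holds; the server can
# only store and forward opaque bytes.
e2e_key = Fernet(Fernet.generate_key())  # known to Alice and Bob only
opaque = e2e_key.encrypt(b"meet at noon")
# The server relays `opaque` but cannot decrypt it.
print(e2e_key.decrypt(opaque))  # b'meet at noon'
```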
We have spoken about the value of a service like Kik for journalists and activists, as well as for adult populations who have been the targets of sexual violence: women, LGBTQI individuals, and kink practitioners. But what attracts the 40% of adolescents in the United States who reportedly use it?
Kik is a free app that enables youth both to have conversations with friends in their existing social circle and to make friends beyond it. Because it is not linked to a phone number, they can download the app on their tablets, sign up with a username, and communicate with their friends that way. Unlike on Facebook or WhatsApp, that username and a profile photo are the only pieces of information other users can see about you before you engage in conversation.
The application clearly makes a worthwhile value proposition to those who use it legitimately. But of course this means that it has also attracted people who want to abuse those users without facing the consequences. If we cannot create gates that keep abusers out, what can we do to best protect the rights of youth who use Kik Messenger?
We can start with the low-hanging fruit for Kik Interactive. Giving users the ability to block is an essential first step, and one the company has taken; a couple of further steps could involve UI improvements such as surfacing the blocking feature on the primary chat menu, and a reporting function attached to the blocking system, as sketched below. Platforms may avoid highlighting reporting functions because they increase the load on their moderation teams, but although there are no simple organizational solutions or fail-proof content moderation models, it becomes imperative to invest in safety when youth are some of your primary users.
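As a hypothetical illustration of that pairing, the following Python sketch applies a block immediately and makes reporting one optional extra step in the same flow. Every name in it (ModerationQueue, block_user, and so on) is invented for illustration; none of this is Kik's actual API.

```python
# Hypothetical sketch: reporting attached to the blocking flow.
from dataclasses import dataclass, field

@dataclass
class ModerationQueue:
    """Stand-in for whatever queue a moderation team actually works from."""
    reports: list = field(default_factory=list)

    def file_report(self, reporter: str, reported: str, reason: str) -> None:
        self.reports.append(
            {"reporter": reporter, "reported": reported, "reason": reason}
        )

def block_user(blocker: str, blocked: str, queue: ModerationQueue,
               report_reason: str = "") -> None:
    # 1. Apply the block immediately, so the user is protected either way.
    print(f"{blocker} will no longer receive messages from {blocked}")
    # 2. Surface reporting as one optional extra tap in the same flow,
    #    rather than hiding it in a separate settings screen.
    if report_reason:
        queue.file_report(blocker, blocked, report_reason)

queue = ModerationQueue()
block_user("alice", "stranger42", queue,
           report_reason="unsolicited sexual messages")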
Another piece of low-hanging fruit is making searches by age impossible, and I will recognize that Kik has come a long way with this: I ran a few obvious search terms on the app, and it filtered out all the results. This is similar to the approach taken by search engines such as Bing and Google, which since November 2013 have omitted search results that would contain child sexual abuse material (Prostasia Foundation is asking Yandex to take the same precautions). The child protection organization Thorn maintains a collaboratively developed “keyword hub” that it shares with Internet platforms to help filter out search terms commonly associated with abuse.
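An illustrative sketch of that filtering approach follows: queries that hit a denylist (such as terms from a shared resource like Thorn's keyword hub) simply return nothing. The terms and matching rules here are placeholders; real systems also have to catch misspellings, slang, and deliberate obfuscation.

```python
# Illustrative search-term filtering sketch; terms are stand-ins only.
DENYLIST = {"banned-term-1", "banned-term-2"}

def filter_search(query: str, results: list) -> list:
    # Normalize the query and check each token against the denylist.
    tokens = query.lower().split()
    if any(token in DENYLIST for token in tokens):
        return []  # suppress every result for a flagged query
    return results

print(filter_search("banned-term-1 groups", ["group A", "group B"]))  # []
print(filter_search("knitting groups", ["group A", "group B"]))       # both results
```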
Finally, even though some adolescents and some adults do use Kik to start respectful, consensual conversations with people they do not know in their own age group, there is value in designing applications that treat communication with contacts you already know as the default, and that more carefully filter new conversations. Professor Sonia Livingstone OBE suggests that Kik give youth more warnings about interactions with adults; these could take the form of a floating box attached to all incoming messages from people outside one's contact list.
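In code, that check could be as simple as the hedged sketch below: flag incoming messages from accounts outside the recipient's contact list. The function and field names are hypothetical, not Kik's real data model.

```python
# Hypothetical sketch of the out-of-contacts warning suggested above.
def needs_stranger_warning(sender: str, contacts: set) -> bool:
    """True when a floating warning box should accompany the message."""
    return sender not in contacts

contacts = {"best_friend_99", "cousin_sam"}
if needs_stranger_warning("unknown_account_01", contacts):
    print("Heads up: this message is from someone outside your contacts.")
```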
But these seemingly simple technical fixes are not enough to address the underlying social dynamics, and it is our role as adults to foster conversations about them. As parents, youth workers, and allies, we must start conversations about consent and boundaries early on, to end the normalization of the idea that unwanted contact from adults is okay and to be expected. We need to make ourselves available for conversation when things go wrong, so that youth will talk to us when they are experiencing things they do not want; and, in our response, we should never blame them for their own abuse. There is nothing a child can do to deserve to become an object of sexual exploitation.
Sexual education has come a long way towards developing effective messaging that helps children make sense of the risks and benefits of the sexual practices in which they will engage. As media literacy and technology advocates, we need to develop our own equivalents of progressive education, ones that reaffirm youth's rights to safety, privacy, and sexual and reproductive health. And we need to remember that the best approaches to youth protection are those that scaffold and accompany young people's encounters with risk, rather than those that pretend to isolate them from the world they will one day inhabit as adults.