Why we need to reset the debate on end-to-end encryption to protect children

by Emma

Last week, the National Society for the Prevention of Cruelty to Children (NSPCC) released a report in a bid to raise understanding of the impact of end-to-end encryption (E2EE) on children’s safety from online sexual abuse.

It aimed to reset a debate that has pitted children’s safety against users’ privacy, with heated arguments doing little to shine a light on a solution that serves both of these critical interests.

We will always unapologetically campaign for children to be recognized in this debate, and for their safety and privacy rights to be considered when platforms roll out E2EE. Children account for one in five UK internet users – it is only right that they have a voice in the decisions that affect them.

It’s necessary because private messaging is the frontline of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt abuse where it is most prevalent.

While E2EE comes with privacy benefits, there is one group of users whose privacy rights it puts at risk – children who have suffered, or are at risk of, sexual abuse.

These children have the right to have images of their abuse removed by tech firms if they are shared on their platforms. They have the right not to be contacted by offenders who recognize their profiles from these pictures and videos. And they have the right to a safe online environment that minimizes the chance of them being groomed to create these images in the first place.

Most major tech firms use tools such as Microsoft’s PhotoDNA to detect child sexual abuse images and grooming on their platforms. These tools allow child abuse images to be rapidly identified and removed when users upload them, including in private messages.

PhotoDNA scans an image only to determine whether it matches known child abuse material, and is no more intrusive than a spam filter. Alongside it, machine learning is used in a proportionate way to identify new child abuse images and grooming.
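PhotoDNA itself is proprietary, but the matching idea behind this kind of tool can be sketched. The Python below is an illustration only: it uses a simple “average hash” as a stand-in for PhotoDNA’s perceptual hash, and the function names, threshold, and placeholder hash list are assumptions for the sake of the example, not the real system.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to size x size, convert to greyscale,
    and set one bit per pixel brighter than the mean. Visually similar
    images produce hashes that differ in only a few bits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Placeholder value; a real deployment would use a vetted hash list
# maintained by a body such as NCMEC or the IWF, not a literal in code.
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """True if the image is within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

The point of the sketch is that the check reveals nothing about an image beyond whether it resembles one already on the list – which is the basis of the “no more intrusive than a spam filter” comparison.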

The rise in self-generated images, where children share photos of themselves, often following grooming and coercion, makes this technology crucial for tackling abuse early and ultimately protecting young users.

At the NSPCC, we have been clear from the start that we are not against E2EE. However, we believe tech firms have a duty to protect all users, and should only roll it out in a way that guarantees these technological safeguards are not rendered useless.

The response to our report shows exactly why this debate needs to be reset, with absolutist arguments around privacy leading to accusations that are often confused or inaccurate. One such accusation is that we are calling for backdoor access to E2EE messages for law enforcement – we are not.

While it is crucial that law enforcement can build evidence to prosecute child abuse, this debate often emphasizes only the investigation of abuse after it has occurred.

Social networks currently play a vital role in protecting children from abuse, and we are more concerned about their ability to detect and tackle child abuse at an early stage.

This is why we want to see tech firms invest in engineering solutions that allow tools like those currently used to detect abuse to keep working in E2EE environments.

Cyber security experts are clear that this should be possible if tech firms commit their engineering time to developing a range of solutions, including “on-device” and other technical mitigations – one possible shape of such a check is sketched below.
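As a hedged illustration only: the sketch below shows one way an on-device mitigation could work, with the check running on the client before encryption, so the service never sees the plaintext in transit. The `send_attachment` function, the placeholder digest list, and the use of exact SHA-256 matching (real proposals use perceptual hashes like the one above) are all assumptions for illustration; this does not describe any platform’s actual design.

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# Placeholder list; a real client would hold a vetted, signed digest list.
KNOWN_DIGESTS = {"placeholder-digest"}

# Stand-in key: in a real E2EE app, keys come from the messaging protocol.
KEY = Fernet.generate_key()

def send_attachment(data: bytes) -> bytes:
    """Hypothetical client-side flow: check the attachment against a local
    digest list BEFORE encryption, then encrypt for transmission. The
    content only ever leaves the device encrypted; the check happens on
    the device, where the plaintext already legitimately exists."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_DIGESTS:
        # A real system might block the send or generate a report;
        # here we simply refuse to transmit.
        raise ValueError("attachment matches a known digest")
    return Fernet(KEY).encrypt(data)
```

The design choice being illustrated is simply sequencing: because the check precedes encryption on the user’s own device, it requires no backdoor and no weakening of the encryption itself.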

Our polling suggests the UK public does not subscribe to the either-or argument of privacy versus children’s safety and that support for E2EE would almost double if platforms could demonstrate children’s safety would not be compromised.

Yet as long as this debate continues to be framed as a zero-sum issue, no one’s interests will be well served – and decisions could be taken that reinforce unhelpfully polarised viewpoints.

It is in the interest of everyone engaged in this debate to achieve a balanced settlement for E2EE that protects the privacy and safety of all internet users, including children.

This must balance the range of fundamental rights at stake, recognizing that this is both a societal and a technological issue. That may be dismissed as mere rhetoric, but it is simply the reality of an incredibly complex issue.
