WhatsApp Threatens to Quit U.K. Over Online Safety Bill’s Assault on Encrypted Messaging

WhatsApp, Signal and five other messaging services have joined forces to attack the Government’s Online Safety Bill, threatening to leave the U.K. market over fears that the bill will kill end-to-end encryption and open the door to “routine, general and indiscriminate surveillance of personal messages”. Matthew Lesh in the Spectator has more.

Encryption provides a defence against fraud and scams; it allows us to communicate with friends and family safely; it enables human rights activists to send incriminating information to journalists. Governments and politicians even use it to keep their secrets from malicious foreign actors (and their colleagues). Encryption should not be thrown away in a panic.

The Government has responded to these concerns by declaring that the bill “in no way represents a ban on end-to-end encryption”. This is technically true but deceptive. The bill gives Ofcom the power to require services to install tools (called “accredited technology”) that would scan encrypted communications for child exploitation and terrorism content.

Advocates claim this is possible without undermining encryption – by scanning for certain content on the user’s device before messages are encrypted. However, just as one can’t be half pregnant, something can’t be half encrypted. Once a service starts reading messages for any purpose, the entire premise of encryption disappears. A 2021 paper from fifteen computer scientists and security researchers explained that it is “moot” to talk about encryption “if the message has already been scanned for targeted content”.
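
To make the point concrete, here is a minimal illustrative sketch (not any vendor’s actual design; the hash list, reporting step and placeholder cipher are purely hypothetical) of where client-side scanning sits relative to encryption: the message is inspected in plaintext before it is ever encrypted, so the encryption no longer guarantees that only the recipient can learn what was said.

```python
import hashlib

# Hypothetical fingerprint list of "targeted content" (stand-in values only).
BLOCKED_HASHES = {hashlib.sha256(b"example banned image bytes").hexdigest()}

def scan(plaintext: bytes) -> bool:
    """Client-side check of the readable message against the blocklist."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKED_HASHES

def send(plaintext: bytes, encrypt) -> bytes:
    # The inspection happens on plaintext, before encryption; whatever
    # guarantee end-to-end encryption offered is already gone at this point.
    if scan(plaintext):
        print("match: message flagged for reporting")  # stand-in for a reporting hook
    return encrypt(plaintext)

# Toy usage with a placeholder cipher (a real app would use a proper E2E protocol).
ciphertext = send(b"hello", lambda m: bytes(b ^ 0x5A for b in m))
```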

With respect to child exploitation material, messages could be checked against the PhotoDNA database. But that database only contains historic photos and videos and cannot be stored on devices. Checking against it means creating a software vulnerability that could be exploited by malicious actors, and sending data back to a central database to check for a match. Alternatively, companies could use machine learning to detect nudity, with flagged content reviewed by the authorities. But that has a high rate of failure: just last year, a father lost his Google account and was reported to the police after sending a naked photo of his child to a doctor.
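
For illustration, a stripped-down sketch of the hash-matching step (PhotoDNA itself is a proprietary perceptual hash; a plain SHA-256 is used here purely as a stand-in and, unlike PhotoDNA, matches only byte-identical files). The point is that the fingerprint, or the image itself, has to leave the device to be compared against a centrally held database.

```python
import hashlib

# Hypothetical list of known-image fingerprints, held server-side rather than
# on the handset (the real PhotoDNA database cannot be distributed to devices).
KNOWN_HASHES = {hashlib.sha256(b"previously identified image").hexdigest()}

def check_image(image_bytes: bytes) -> bool:
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    # In a deployed system this fingerprint (or the image itself) would be sent
    # off the device for comparison against the central database -- that channel
    # out of the handset is the new attack surface described above.
    return fingerprint in KNOWN_HASHES

print(check_image(b"previously identified image"))  # True
print(check_image(b"an innocent holiday photo"))    # False
```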

Some contend that privacy should be sacrificed in the fight against child abuse. But there are clearly limits to this logic. Few would consent to the state putting CCTV in everyone’s bedroom to crack down on the abuse of children. But that is effectively what a technology notice could mean: a CCTV camera in everyone’s phone. Ofcom could even require the use of scanning technology without independent oversight (unlike the Investigatory Powers Act, which at least requires authorities to seek permission from a tribunal and is generally targeted at a specific individual rather than enabling mass surveillance).

Message scanning is open to serious mission creep. There will be enormous pressure to scan communications for other purposes, from ‘disinformation’ in the U.K. to any unsanctioned material in authoritarian countries. This is why platforms, which do not want to create a vulnerability in their products or set a global precedent for their billions of users, really could leave the relatively small U.K. market over the bill. The shutdown of WhatsApp in particular would be a political disaster for any Government, and not just because ministers and MPs would lose their main communications platform; millions of people across the country who use it would lose theirs too.

Read More: WhatsApp Threatens to Quit U.K. Over Online Safety Bill
