The Government is seeking to undermine big technology companies’ use of encryption through its draft Online Safety Bill, an industry expert has claimed.
Meredith Whittaker, president of secure messaging service Signal, said a provision in the Bill focused on encryption had “significant issues” around the mass surveillance of private information.
End-to-end encryption is a security measure that protects data and communications by scrambling them so that only the sender and intended recipient can read them.
It is widely used to safeguard sensitive information, with Signal and fellow messaging service WhatsApp among its high-profile users.
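For the technically curious, the idea can be sketched in a few lines of Python using the PyNaCl library. This is a simplified illustration only, not Signal’s actual protocol, which layers on further protections such as forward secrecy:

    from nacl.public import PrivateKey, Box  # pip install pynacl

    # Each user generates a key pair; the private half never leaves their device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts with her private key and Bob's public key.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hello, Bob")

    # A server relaying the ciphertext sees only scrambled bytes;
    # only Bob, holding his private key, can recover the message.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"hello, Bob"

The sketch shows the property at the heart of the dispute: without the recipient’s private key, an intermediary has no technical means of reading the message.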
The Online Safety Bill, which is next due to be debated in the House of Lords on Thursday, aims to tackle illegal and harmful content online by imposing new legal requirements on big tech companies.
This includes giving the regulator Ofcom greater powers to monitor private communications, including those protected by encryption.
Ms Whittaker told the PA news agency: “The bulk of this could be remedied by adding a sentence fragment to the provision in the Bill, that simply clarifies that Ofcom will not use this power to force an entity to adulterate or undermine its end-to-end encryption. That’s really what we are asking for.
“Lord [Daniel] Moylan has tabled an amendment in the past that would do just this.
“What I heard from Damian [Collins MP, former technology minister] is that he and others are unwilling to do that because they want to be able to force [Facebook’s messaging application] Messenger and other services not to implement encryption.
“So it’s very clear there is not just an attack on encrypted services, but a play to prevent other actors from providing this privacy.”
Home Secretary Suella Braverman told the House of Commons on Monday that allowing Messenger to introduce encryption could hinder efforts to proactively detect and report instances of child grooming and other abuse material online.
However, Ms Whittaker said countries around the world were looking to the UK to take the first steps in such legislation, and that passing the Bill as it stands would set a precedent authoritarian governments could follow.
“Encryption is a technology that protects privacy and expression, and it either works for everyone or it is broken for everyone, leaving infrastructure vulnerable to hackers, to exploitation and to social control if those powers fall into the hands of the wrong regime,” she said.
Describing the Online Safety Bill as “extraordinarily important”, Ms Whittaker, who started her career at tech giant Google, said the legislation was also vital from a business perspective.
WhatsApp owner Meta threatened to withdraw the service from the UK last March rather than submit to the proposed policies.
She said: “I think there is also a real economic threat that this legislation poses to the UK’s desires to be a leader in technology and AI.
“I’m hearing from AI leaders that I know that there is a concern that this Bill reveals that the UK Government does not have a coherent position on technology.”
Despite her concerns about encryption, Ms Whittaker said the Online Safety Bill more broadly contained some “very positive” provisions, and she remained hopeful a resolution could be found.
She said: “I would at this moment register cautious optimism. My sense is this provision was snuck in at the last minute – it was only added in September [2022] and that was a fairly chaotic time in the UK.
“People with a lot on their plates didn’t necessarily understand the significance of what was happening – it flew under the radar.”
Children’s charity the NSPCC has accused some tech companies of trying to “pit children’s fundamental right to safety against the privacy rights of adults”, and said polling suggested they are “out of step” with the public on the issue.
The charity said its YouGov survey of 1,723 adults across the UK in April found “overwhelming public support for measures to protect children from abuse in private messaging”.
Almost three quarters (73%) said they felt technology companies should be required by law to use accredited technology to identify child sexual abuse in end-to-end encrypted messaging apps, while 79% said firms should develop technology that allows them to identify child abuse in such apps.
Richard Collard, head of child safety online policy at the NSPCC, said: “Most tech companies already scan for child sexual abuse on their apps and messaging services, leading to more than 1,000 children being protected from sexual abuse every month.
“It is now clear that companies who wish to pit children’s fundamental right to safety against the privacy rights of adults are out of step with the public and, ultimately, their user base.
“Tech firms can show industry leadership by working with regulators, child safety advocates and safety tech companies to invest in technology that protects both the safety and privacy rights of all users.”