Professor Nigel Smart at Zama warns that backdoors don’t just threaten individual privacy, but could backfire on governments and cause a global security risk
Everyone uses systems that provide end-to-end encryption, whether this be WhatsApp, Signal or encrypted backups for your phone. And when we say everyone, we mean everyone; not just individuals, but also companies, law-enforcement agencies, and governments.
End-to-end encryption protects data (messages, files, calls, or any other information) so that only the sender and recipient can access it, preventing interception by third parties, even service providers and governments.
So to allow someone access to end-to-end encrypted data, there needs to be a change to the protocol. This change is called a backdoor. The idea is that once the backdoor is inserted, then the ‘special person’ can access the contents without the need for the participants in the communication to know.
Essential or damaging?
Are encryption backdoors necessary for security or do they weaken online systems, undermining everyone’s safety?
For a number of decades now, governments in the UK, US, and across Europe have been pressuring tech companies to allow backdoors in the name of national security and law enforcement; from the FBI’s battle with Apple over unlocking a terrorist’s iPhone in 2016, to the latest example - as reported by the Washington Post - the claim that UK government officials secretly ordered Apple to build a backdoor giving them blanket access to users’ encrypted iCloud data.
According to the police, backdoors are useful in investigations: special access would be provided to them under some (often ill-defined) legal framework, the bad criminals would be caught, and the good guys would be unaffected.
The problem is, it’s not quite that simple. End-to-end encryption is both complex and subtle by nature: you can’t just add a backdoor that only the good guys can access and the bad guys cannot.
Instead, what we end up with is a system that’s weakened, affecting the security of everyone - individuals, companies, government, and even the law-enforcement agencies wanting to use these systems.
Then there’s the political dilemma: if one government, say the UK, demands a backdoor to access encrypted data, what happens when France, Spain, the U.S., Russia, China, or even North Korea follow suit? Once a tech company weakens its encryption for one country, what prevents others from doing the same? A backdoor created for China, for example, wouldn’t just affect users in China, it would also compromise the security of Western visitors, exposing their data to the same vulnerabilities.
Of course, the Washington Post article claims even more: that the UK government wanted a backdoor to see the encrypted iCloud data of ALL accounts, irrespective of who the owner was and where the phone was located.
Why should the UK government have access to the data of other countries’ citizens and companies? Indeed, low-grade government communication is often carried over commercial end-to-end encryption products, so such access would also give the UK government access to other countries’ low-grade government communications. By making this demand, the UK government opens the door to other governments doing the same, ironically reducing its own security in the long run.
What is strange is that this news comes only a few weeks after the US FBI and CISA (the Cybersecurity and Infrastructure Security Agency) issued a warning encouraging Americans to use end-to-end encryption, following the Salt Typhoon attack on US call and phone records.
Is there a way to avoid backdoors?
With all this in mind, it begs the question: with technology developing so fast, is it possible to avoid backdoors yet still give law enforcement access to data when it is needed? It’s an intriguing question, and almost every proposed solution has failed thus far.
One proposal, pushed by some governments in the EU, is so-called ‘client-side scanning’. In this scenario, a user would have some form of government program on their phone that scans all messages sent and received for bad content, such as child pornography, terrorist data, etc. Once such data was detected, the government agency would be alerted and the police would be able to catch the bad guy.
This sounds great, but it suffers from huge problems. The bad guys would simply not use such phones, or would remove the scanning code, so the system would only ever monitor honest citizens - in which case, why bother?
Such automatic detection software would, in practice, also produce many false positives, meaning the police would end up investigating more innocent people instead of concentrating resources on known criminals.
Finally, special scanning software like this risks ‘mission creep.’ By this, we mean a government led by Party X could expand its use to monitor conversations or data related to supporters of Party Y, or for any other purpose. This is something that happened in the UK, where security cameras installed to prevent crime were later used by a council to check if residents put their bins out too early.
The bottom line here is that scanning content and flagging specific material to others before encryption fundamentally undermines the very purpose of end-to-end encryption, which is to ensure true privacy and security.
Innovation in cryptography
Rather than weakening encryption with backdoors or flawed scanning systems, governments should focus on privacy-preserving technologies that balance security with individual rights. The field of cryptography has evolved to offer solutions that protect privacy while enabling secure data processing.
End-to-end encryption already ensures data is protected in transit (WhatsApp, Signal) and at rest (iPhone and Google cloud backups). The next step is protecting data during processing - an area where advanced cryptographic techniques like Fully Homomorphic Encryption (FHE) are making strides.
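To make the idea of computing on encrypted data concrete, here is a toy sketch of the Paillier cryptosystem, a simpler, additively homomorphic relative of FHE. The parameters are deliberately tiny and insecure, and this is purely illustrative - real FHE schemes are far more sophisticated and support arbitrary computation, not just addition. The point is that two numbers can be added together while both remain encrypted, and only the final result is ever decrypted.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Illustrative only - tiny fixed primes, NOT secure for any real use.
import math
import random

p, q = 293, 433          # small demonstration primes
n = p * q
n2 = n * n
g = n + 1                # standard choice of generator
lam = math.lcm(p - 1, q - 1)
# mu is the modular inverse used during decryption
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m: int) -> int:
    """Encrypt a small integer m (0 <= m < n) with fresh randomness."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Recover the plaintext from ciphertext c."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so the sum 20 + 22 is computed without ever decrypting the inputs.
c = (encrypt(20) * encrypt(22)) % n2
assert decrypt(c) == 42
```

In a full FHE scheme, both addition and multiplication (and hence arbitrary programs) can be evaluated on ciphertexts in this way, which is what makes processing encrypted data possible without exposing it.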
However, while FHE allows computations on encrypted data without exposing the underlying information, it is not a silver bullet for government demands. Some have proposed using FHE for client-side scanning, but this is still fundamentally flawed.
Even if it were technically possible, it would suffer from all the issues mentioned above (mission creep, false positives, being used only by the good guys, etc.). By its very nature, client-side scanning, no matter how it is implemented, defeats the purpose of end-to-end encryption, and as such could further weaken people’s already fragile trust in the digital technology they use.
Rather than governments pushing for invasive measures that ultimately weaken security for everyone, the conversation should instead be about how innovation in cryptography can support both privacy and security in a way that doesn’t introduce systemic risk.
Professor Nigel Smart is Chief Academic Officer at Zama and a leading expert in cryptography
Main image courtesy of iStockPhoto.com and Hailshadow
© 2025, Lyonsdown Limited. Business Reporter® is a registered trademark of Lyonsdown Ltd. VAT registration number: 830519543