Why building backdoors into encryption won’t make us safer

For much of the last decade, technology companies have fought an uphill battle to save strong encryption, a battle whose skirmishes they often lose. Throughout this ongoing clash, governments across the world have pushed to backdoor encryption in the name of combating child abuse and terrorism.


The battle has come to a head several times in recent years, including when the FBI demanded that Apple assist in unlocking the encrypted work phone of one of the San Bernardino shooters after the December 2015 attack, and again after a shooting in Pensacola, Florida in December 2019. I don’t think you’d find a single person who is against helping law enforcement put actual criminals behind bars, but the collateral damage of these “anti-encryption” measures is simply too devastating to justify.

End-to-end encryption

Tech companies focused heavily on privacy and security in the 2010s and many rolled out products with improved encryption. Messaging platforms WhatsApp and Signal both added end-to-end encryption to their users’ communications in 2014. That same year, Apple enabled encryption by default in iPhones with the release of iOS 8.

While encryption can come in many forms, it always comes with the same goal: protecting data confidentiality. End-to-end encryption achieves that goal by setting up an encrypted channel where only the client applications themselves have access to the decryption keys. In the case of WhatsApp, this means that even though users’ messages might traverse or be stored on WhatsApp’s servers, the company doesn’t have access to the encryption keys that would allow it to decrypt and read those messages. The messages stay private to all but the sender and the receiver.
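To make that property concrete, here is a minimal sketch of the core exchange using the PyNaCl library (my choice for illustration; WhatsApp and Signal actually use the far more elaborate Signal Protocol, which adds ratcheting session keys for forward secrecy). The point is that private keys never leave the endpoints, so a relay server only ever sees opaque ciphertext:

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only: real messengers layer the Signal Protocol on top of
# primitives like this one.
from nacl.public import PrivateKey, Box

# Each client generates its own keypair; private keys stay on the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only *public* keys are exchanged (e.g., via the service's key server).
alice_box = Box(alice_private, bob_private.public_key)
bob_box = Box(bob_private, alice_private.public_key)

# Alice encrypts; the server relaying this sees only opaque bytes.
ciphertext = alice_box.encrypt(b"meet at noon")

# Bob decrypts on his own device with his private key.
assert bob_box.decrypt(ciphertext) == b"meet at noon"
```

Nothing in the middle holds a key that can open `ciphertext`, which is precisely the property a mandated backdoor would have to break.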

In the case of encryption-at-rest (like on the iPhone), the user’s password or PIN doesn’t act as the encryption key directly; it is used, together with a hardware-bound key, to derive the keys that actually encrypt the data. When the phone boots up, the user has to enter their password or PIN to unlock the phone’s data. Any new data the phone receives or creates – like images or chat messages – is encrypted using those keys. If the phone powers off or is put in a “lockdown mode,” the decrypted keys and data are flushed from the phone’s memory and the user must enter their password again to unlock it.
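A simplified sketch of that pattern, using the Python cryptography package (an assumption for illustration; Apple’s real implementation entangles the passcode with a per-device hardware key inside the Secure Enclave, so keys can’t be brute-forced off-device):

```python
# Sketch of passcode-based encryption at rest (pip install cryptography).
# Simplified: real devices also mix in a hardware-bound key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A deliberately slow KDF makes each passcode guess expensive.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(passcode.encode())

salt = os.urandom(16)
key = derive_key("123456", salt)

# Encrypt a piece of data "at rest"; the nonce must be unique per message.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"photo bytes...", None)

# "Locking" the device amounts to discarding the derived key from memory;
# only re-entering the passcode can re-derive it.
del key
```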

The FBI and other law enforcement agencies around the world are asking Apple and other manufacturers to create a “golden key” (so to speak) with the ability to decrypt all messages on all devices. Australia even managed to pass legislation in 2018 that allows authorities to force companies to create backdoors in their encryption. While it is technically possible to accomplish that goal, the security and privacy ramifications would be massive.
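What agencies are describing is essentially key escrow. Here is a hedged sketch of how such a scheme could work – a hypothetical design for illustration, not any vendor’s real protocol – again using PyNaCl. Every message key gets an extra copy encrypted to a government-held “golden” public key:

```python
# Sketch of a hypothetical key-escrow ("golden key") scheme using PyNaCl.
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

escrow_private = PrivateKey.generate()     # the "golden key"
escrow_public = escrow_private.public_key  # baked into every device

# Normal per-message encryption between two users.
message_key = random(SecretBox.KEY_SIZE)
ciphertext = SecretBox(message_key).encrypt(b"meet at noon")

# The backdoor: the message key is also sealed to the escrow key.
escrow_copy = SealedBox(escrow_public).encrypt(message_key)

# Anyone holding escrow_private -- an agency, or whoever steals it --
# can recover every message key ever escrowed.
recovered = SealedBox(escrow_private).decrypt(escrow_copy)
assert SecretBox(recovered).decrypt(ciphertext) == b"meet at noon"
```

The single point of failure is plain to see: one compromise of `escrow_private` exposes every escrowed conversation at once.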

The problem with the drive to backdoor encryption

There’s simply no such thing as a “good guys only” backdoor. Eventually, a cyber-criminal will get their hands on the “golden key” or exploit the intentional chink in the armor to break their way in. The NSA losing control of its stockpile of Windows exploits in the 2016 Shadow Brokers leak should be proof enough that we shouldn’t be so quick to trust government agencies to handle security responsibly.

Organizations rely on encryption to protect their intellectual property. Journalists rely on encryption to protect themselves and their sources from oppressive governments. You can probably imagine the amount of resources a hostile nation state would pour into finding such a backdoor if it existed.

What if we took a step back and examined the encryption debate using a physical safe as an analogy? People use safes to store important documents and items that they want to keep out of the hands of criminals. At the same time, people can also use them to store evidence of crimes. Should safe manufacturers be required to intentionally add a weak point to every safe or create a master key? Or should law enforcement be required to go through legal channels to compel owners to give up their keys?

The former is exactly what governments are asking Apple, WhatsApp and others to do. Law enforcement, at least in the US, already has the power to obtain massive amounts of data through the court system. In the case of the Pensacola shooter, for example, Apple handed over iCloud backups, account information and transactional data for multiple accounts. The FBI eventually gained access to the phone in question without Apple’s help, calling into question whether it needs a backdoor at all.

Pushback against anti-encryption regulations has become strong enough that many governments are now far more covert about their attempts. Take the EARN IT Act, for example. Introduced in the US Senate earlier this year, it doesn’t explicitly outlaw encryption. Instead, it establishes a commission, headed by the US Attorney General, that can define a checklist of “best practices” organizations must follow to keep their protection from civil and criminal liability for user-generated content under Section 230 of the Communications Decency Act. That list of best practices could easily include weakened encryption requirements, and will likely be shaped heavily by the desires of the sitting Attorney General.

Even if most governments managed to pass anti-encryption laws, criminals would simply move to apps that don’t comply. Giving up the security and privacy of the masses is simply too big a price to pay for something that is very unlikely to prevent crime and incredibly likely to result in abuse.
