The Problem with Encryption Backdoors

Cybersecurity is full of concepts that seem wonkish or technical but are actually deeply impactful for everyday people going about their lives--whether they're email users at a large enterprise or simply private individuals who care about their data privacy. Encryption backdoors, for instance, put everyone's data at risk every day, yet your average iPhone user probably couldn't define the term for you.

This isn't shocking: most people ignore cybersecurity concerns outright, and even those who take them seriously often don't want to commit to laborious manual effort when it comes to key management, establishing trust, and identity management. The trouble is, when we want things to be invisible from an operational perspective, we often settle for solutions whose inner workings are invisible as well. This is where people start to get into trouble--and it's where dangerous encryption backdoors gain a foothold and begin to compromise your data security.

What Is an Encryption Backdoor?

Any method of circumventing encryption or authentication can be considered a backdoor. Oftentimes, this is something that’s intentionally built into a system. Just as frequently, however, the backdoor is something that hackers, corporations, or governments “find” after the fact--whether that’s in the form of a security vulnerability or a way of pushing software updates.

I've written about Crypto AG before. That scandal represented one of the most brazen examples of the former kind of backdoor. Governments and other organizations thought they were purchasing devices with strong encryption that would keep prying eyes away from their data, but in fact the devices were built to give the CIA and the BND access to that information whenever they wanted it. These were black-box devices, so by and large there was no way for an organization to see that something was amiss. Yugoslavia seemingly uncovered the "bug" and fixed it on its own end, which may be why most Eastern Bloc countries never fell victim to the spying, but the fact remains that for five decades entities in 100 countries placed their trust in encryption technology that was intentionally flawed.

Though Crypto AG is the most recent (and perhaps the most flagrant) example, it is certainly not the only organization playing this game. There have been strong suggestions that Huawei uses these kinds of backdoors in its devices. And let's not forget that the NSA snuck a backdoored algorithm (the Dual_EC_DRBG random number generator) into a NIST cryptographic standard, essentially ensuring that faulty encryption systems were built around the world as a result. This isn't quite the same as the government putting its seal of approval on a piece of malware, but it's not far off, either.

The latter kind of backdoor is probably more common, and it's perhaps best exemplified by the famous standoff between Apple and the FBI following the 2015 San Bernardino terrorist attack. The FBI wanted Apple's assistance in opening the phone of one of the assailants: Apple would have had to push an OS update to the device to remove the limit on unlock attempts, giving the feds the ability to break the 4-digit passcode through brute force. Apple refused. The FBI filed a case to compel Apple to comply, but withdrew it after finding a third-party company that was able to uncover and exploit a vulnerability in the phone and decrypt the data on the FBI's behalf.

Though Apple didn't ultimately comply with the FBI, the mere fact that it could have opened up the phone is itself a kind of backdoor. If Apple truly wanted to get the data off a phone protected only by a 4-digit passcode, it could do so no matter what--meaning that without additional encryption protection, no user's information on an iPhone is truly safe from Apple.
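To make the arithmetic concrete, here is a minimal sketch of why a 4-digit passcode offers essentially no protection once the attempt limit is gone. The key-derivation function and stored verifier below are hypothetical stand-ins, not Apple's actual scheme: with only 10,000 possible codes, an exhaustive search finishes in moments.

    import hashlib

    def derive_key(passcode: str, salt: bytes) -> bytes:
        # Hypothetical stand-in for the device's real key-derivation function.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

    def brute_force(stored_key: bytes, salt: bytes):
        # Only 10,000 candidates exist: "0000" through "9999".
        for n in range(10_000):
            candidate = f"{n:04d}"
            if derive_key(candidate, salt) == stored_key:
                return candidate
        return None

    salt = b"device-unique-salt"
    stored = derive_key("4821", salt)      # the "forgotten" passcode
    print(brute_force(stored, salt))       # prints: 4821

The only thing standing between an attacker and the data is the attempt limit--which is exactly what the requested OS update would have removed.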

Why Do They Exist?

For the user who wants privacy and data security, an encryption backdoor is an obvious violation of trust. Those who purchased Crypto AG's devices never would have done so had they known their data wasn't really secure--meaning that at the end of the day these backdoors exist solely for the benefit of those who want to access information that others have taken the trouble to make private.

Law enforcement agencies tend to love backdoors for that reason. Historically, criminal cases have been broken by a tapped landline call, a stray note left in a calendar, or some other physical piece of communication. Now it's all digital and password protected. This is called the "Going Dark" problem: law enforcement agencies have the legal authority to intercept and access communications and information with a court order, but they lack the technical ability to do so.

If they can’t intercept in real-time, the argument goes, or access any data that’s stored, then they can’t get the crucial evidence they need, and the suspect walks free. This line of reasoning says that cases of murder, theft, human trafficking, and so on, will be stalled or fail because of lack of access.

But this argument is suspect on its face. Why? Because it’s an all-or-nothing approach--it guarantees that no one’s information is private (except those who take extra steps to utilize encryption programs that don’t have backdoors), meaning that due process is thrown out the window alongside privacy.

Nor does encryption stand in the way of the law. It is always possible to target a suspect and obtain the information the authorities are after (nobody can fully protect their devices from a targeted attack by organizations like the NSA, the FBI, and so on--and that is okay), but the authorities should first get the necessary approvals to surveil confirmed suspects.

What should not be acceptable is the attempt to monitor and record every innocent citizen by undermining encryption, making the world a less safe place in the process. Similarly, we do not build safes out of Swiss cheese (instead of solid steel) just because criminals might use one to hide their activities. There are far more safes in the world used for good reasons by honest banks, companies, and individuals than by criminals. The same applies to good encryption.

The Real Problem With Backdoors

Again, no user wants to use encryption software that has a backdoor--whether that's a direct line to the CIA or to the Mafia. So the right question isn't "Why do they exist?" but "Why are they so prevalent?" The simple answer is that most encryption technology is a complete black box, meaning the user has no way to see how her data is being protected. She just has to take it on faith--to believe the profit-motivated company selling the encryption solution when its marketing materials say the product truly ensures privacy (remember Zoom?).

Without insight into the actual source code by which the data is being encrypted, however, there's no way to be sure that the encryption is safe and robust. Furthermore, there's no way to know that there isn't a backdoor of one kind or another lurking within the software. Even if a vulnerability is in no way due to malice, it still endangers your confidential information. If you can't check for vulnerabilities yourself (and have groups of trusted experts check for you), any encryption technology could still pose a risk.

Trust, or Verify?

Here you might raise an objection: “I’m not a computer scientist, a cryptographer, or a mathematician--I have to trust the experts, which is why I rely on external certifications and reputation, rather than code review.”

My response to that is simple: NIST was a well-respected standards body, and it turned out to have standardized an algorithm with a backdoor intentionally baked in. Crypto AG had a phenomenal reputation--and as a Swiss company it was perceived as neutral--and yet it was responsible for the biggest data security scandal of the century. It was a profit-motivated business with 'special' owners that offered no visibility into its technology, and the result was embarrassing for virtually every customer involved.

This is why trust and certifications are no substitute for transparency. When code is open source, peer reviewed with published results, and built reproducibly, there's nowhere for a backdoor to hide. Only once you have full published code audits and reproducible builds to show for your technology have you really earned anything resembling trust.
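Verifying a reproducible build is conceptually simple: compile the published source yourself and confirm that your artifact is byte-for-byte identical to the one the vendor ships. A minimal sketch of that check (the file names are placeholders, not any real project's artifacts):

    import hashlib

    def sha256_of(path: str) -> str:
        # Hash the file in chunks so large binaries don't need to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    official = sha256_of("vendor-release.bin")   # the binary the vendor ships
    rebuilt = sha256_of("my-own-build.bin")      # built locally from the published source

    if official == rebuilt:
        print("Reproducible: the shipped binary matches the published source.")
    else:
        print("Mismatch: the shipped binary is not what the source produces.")

If the hashes differ, either the build isn't deterministic or the shipped binary contains something the source doesn't show--and either way, the vendor has some explaining to do.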

Considering how things tend to turn out when profit-motivated businesses offer blackbox encryption solutions, it’s hard to imagine that informed consumers would want to settle for anything less than the level of transparency we’re describing.

How p≡p Ensures the Safety of Your Data

At p≡p, we make sure you don't have to worry about encryption backdoors. Our mission is to offer a secure wall for your data, with absolutely no cracks or doors for anyone but you, and it will remain that way. It's decentralized, fully automated, end-to-end encryption for your email, and we're completely open about how it works. In fact, we want you to know. Our software isn't an encryption black box: we've made the Open Source code available to everyone. Here, and only here, is there no expectation of privacy. We seek out code audits, publish the results, and make our builds reproducible.

And what will that code base show you? You'll see that between you hitting send and your recipient opening the email, your information is securely encrypted, including metadata and attachments. That ensures confidentiality, integrity, and authenticity for your communications, and it even cuts down your exposure to phishing. Our software automatically generates public and private keys for all of your corporate email users, but because our service is truly end-to-end and peer-to-peer, there would be no way for us to access those keys even if we wanted to. Even as the p≡p software manages your encryption behind the scenes and rotates keys periodically, your security remains assured, although the security measures themselves stay invisible to email users, SWIFT admins, and others sending encrypted messages.
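The principle is easy to illustrate. The sketch below is a generic asymmetric-encryption example using RSA, not p≡p's actual OpenPGP-based implementation: the key pair is generated on the recipient's own device, only the public key is ever shared, and the private key needed to decrypt never leaves that device--so no vendor, including us, could hand it over.

    # Generic end-to-end sketch: keys live only on the user's device.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Recipient generates a key pair locally; the private key stays on-device.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()   # only this part is ever shared

    # Sender encrypts with the recipient's public key.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    ciphertext = public_key.encrypt(b"Confidential message body.", oaep)

    # Only the holder of the private key can recover the plaintext.
    print(private_key.decrypt(ciphertext, oaep).decode())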

For us at p≡p, privacy is paramount, and we will never have access to your keys or information. Everyone deserves to feel secure in their communications, and we want that to be as effortless as possible. Our vision is to make all written digital communication secure, end to end, by default. We'll continue to work towards it, and we'll remain open and honest with our customers.

If this sounds better than leaving security up to chance and external certification, leave your thoughts below and/or drop us a line at p≡p Security.