Six (More) Technical Reasons Why Undermining Encryption is a Bad Idea

Two weeks ago, some of the biggest names in cryptography published a paper on the threats posed by intentionally weakened encryption. In the report, titled “Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications,” these experts argue that legislators must reject any proposal to turn back the clock to the encryption control proposals that were at the center of the Crypto Wars of the 1990s. The report was widely covered in the media (including a front-page story in the New York Times), but the press missed one of its most interesting aspects: the technical reasons why intentionally inserting weaknesses into encryption is a terrible idea.

We have already seen why backdoors are bad for privacy, the economy, and civil liberties. The report compellingly argues that there are huge technical problems in implementing the U.S. government’s desire to weaken encryption to facilitate surveillance, which the authors refer to as “exceptional access.” There is no golden key, there is no easy fix, and here are six reasons why:

  1. It breaks forward secrecy: The concept of “forward secrecy” means that decryption keys are only used for a single message – they’re deleted immediately after they are used – limiting the threat to user privacy if a system is breached. If this isn’t the case, then hackers could use a stolen key to read your iMessages from yesterday, last week, or three years ago. Backdoors require the government to be able to access all past communications with a warrant, so maintaining forward secrecy isn’t possible.

  2. You can’t sign your name: Encryption isn’t only about hiding content; it can also be used to verify that a particular person sent a message and even that the message hasn’t been tampered with. We call these features the authenticity and integrity of messages. The theft of an escrowed key means that new messages can’t be trusted, because they could be from someone else or could have been silently changed in transit.

  3. It’s harder than telecommunications surveillance: Backdoors in encryption don’t work the same way as traditional telecommunications surveillance. Internet and telecommunications carriers tend to use similar technologies, so access mandates can be implemented in a handful of places; encryption, by contrast, is built by hundreds of thousands of developers into very different types of software, each of which would need to deploy and test the new access features. And because users aren’t supposed to know when this surveillance is happening, security testing isn’t practical or effective.

  4. It puts all the keys in one basket: A manufacturer, government agency, or third-party company would need to store the information the government can use to decrypt communications. That store would be a juicy target for any hacker looking to gain access to huge numbers of systems, and an attacker who breached it could compromise nearly everyone’s data at once. Think of the OPM breach, or any of the other recent high-profile hacks, but including every message you’ve ever sent to your doctor, your employer, or your best friend.

  5. You can’t make everyone comply: There just isn’t a practical way to detect or deter companies that don’t implement backdoors. Do we block encrypted apps with a national firewall? Make it illegal for individuals to use them? Force companies to get their products approved by law enforcement? That last option might work for big companies, but imagine every Silicon Valley startup applying to a regulator before it can launch its newest product.

  6. The Internet is global: Because the Internet is global, these proposals pose global technical and legal problems. Even if backdoors could work for U.S. tech companies (which they can’t), users could still download non-compliant software from other countries. How do we protect against attacks based on location spoofing? How do we design mobile phone systems to give access only to the host jurisdiction?
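The forward-secrecy point in item 1 can be sketched with a simple hash ratchet. This is a hypothetical toy, not any particular product’s protocol: each message key is derived from a chain key that is immediately replaced and discarded, so compromising today’s state reveals nothing about yesterday’s messages. A backdoor mandate would require keeping exactly those old keys around.

```python
import hashlib
import hmac
import secrets

def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key and the next chain key.

    The old chain key is discarded after use, so a later compromise
    cannot re-derive earlier message keys (forward secrecy).
    """
    message_key = hmac.new(chain_key, b"msg", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    return message_key, next_chain

chain = secrets.token_bytes(32)   # initial shared secret
k1, chain = ratchet(chain)        # key for message 1, then advance the chain
k2, chain = ratchet(chain)        # key for message 2

# Knowing the current `chain` value does not let an attacker walk
# backwards to recover k1 or k2 -- unless someone is forced to escrow
# the old keys, which is precisely what exceptional access demands.
assert k1 != k2
```

The one-way HMAC step is what makes walking the chain backwards infeasible; an escrow requirement defeats it by mandating retention of the very material the protocol is designed to destroy.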
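The escrowed-key worry in item 2 can be illustrated with an HMAC integrity check, a deliberately simplified stand-in for real authentication schemes (the key and messages here are hypothetical): anyone holding a copy of the key can mint tags that verify perfectly, so a stolen escrow copy makes forgeries indistinguishable from genuine messages.

```python
import hashlib
import hmac

# Hypothetical symmetric key that a mandate would force into escrow.
escrowed_key = b"shared-key-also-held-in-escrow"

def tag(key: bytes, message: bytes) -> bytes:
    """Compute an authentication tag over the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

msg = b"wire $500 to account 12345"
alice_tag = tag(escrowed_key, msg)

# The recipient's check passes for the genuine message:
assert hmac.compare_digest(alice_tag, tag(escrowed_key, msg))

# But a thief with the escrowed copy can forge a different message
# whose tag verifies just as well -- authenticity is gone:
forged = b"wire $500 to account 99999"
forged_tag = tag(escrowed_key, forged)
assert hmac.compare_digest(forged_tag, tag(escrowed_key, forged))
```

Real systems use public-key signatures rather than shared-key MACs, but the failure mode is the same: whoever obtains the escrowed secret can produce messages the verifier cannot distinguish from the real thing.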

Some of the best computer scientists and security experts in the world are telling us that there is no way to grant law enforcement access to encrypted communications without unintentionally enabling others to access them as well. Law enforcement is asking not only for technology that doesn’t work, but for technology that puts the very people it wants to defend from terrorists, cyber criminals, and foreign governments at risk. In 1997, many of these same experts wrote a paper that helped win the Crypto Wars by showing that the government’s plan for backdoors didn’t work, and they’re telling us the same thing today. Without answers to these complex technical questions, legislators should learn from the lessons of the 1990s and embrace strong encryption as a tool to make everyone’s communication safer and more secure.


Andi Wilson is a policy analyst at New America’s Open Technology Institute, where she researches and writes about the relationship between technology and policy.