History of the Encryption Debate
Federal Trade Commission Commissioner Rohit Chopra began the event by reminding participants that we had “gone through this same debate 20 years ago… that [the Justice Department] is probably recycling some older arguments, and that it may not fully be taking into account how unencrypted communications may actually be a risk to the public.” These discussions are indeed not new: debates about encryption, individual privacy, and the role of government have circulated in U.S. policy for decades. This section briefly reviews that historical context as a primer for understanding the panel’s discussion.
Until the mid-1970s and the development of public key cryptography, the government held a domestic monopoly on the use of electronic ciphers. That changed in 1976, when researchers Whitfield Diffie and Martin Hellman published a paper demonstrating how ordinary users could employ a pair of mathematically related public and private keys to encrypt and decrypt plaintext conversations.1 Unlike previous methods of encryption, this technology allowed two or more parties to communicate privately and securely without ever having met before.2 Although these advances in cryptographic technology made encryption available to non-government users, it took societal factors, including the increasing use of personal computers by large companies, the demand for secure email technology for individual use, and the increased use of portable and mobile devices, to bring the importance of encryption from academic conversations into the mainstream.
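The key-exchange idea at the heart of Diffie and Hellman's 1976 paper can be sketched with a toy example. The numbers below are purely illustrative (real deployments use primes thousands of bits long and vetted cryptographic libraries), but they show how two parties can derive a shared secret while only ever exchanging public values:

```python
# Toy Diffie-Hellman key exchange (illustration only; real systems use
# 2048+ bit primes and audited libraries, never parameters this small).

p, g = 23, 5          # public parameters: a small prime modulus and a generator

alice_secret = 6      # Alice's private value, never transmitted
bob_secret = 15       # Bob's private value, never transmitted

# Each party sends only g^secret mod p over the open channel.
alice_public = pow(g, alice_secret, p)   # Alice -> Bob: 8
bob_public = pow(g, bob_secret, p)       # Bob -> Alice: 19

# Each side combines the other's public value with its own secret;
# both arrive at the same number without ever revealing their secrets.
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

assert alice_shared == bob_shared == 2   # identical shared secret
```

An eavesdropper sees only `p`, `g`, and the two public values; recovering either secret from them is the discrete logarithm problem, which is what makes the scheme hard to break at realistic key sizes.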
By the early 1990s the proliferation of encryption as part of modern communications provoked the first congressional attempt to force electronic communications service providers and equipment manufacturers to include backdoors. A 1991 anti-terrorism bill included language requiring that companies “shall ensure that communications systems permit the government to obtain the plaintext contents of voice, data, and other communications when appropriately authorized by law.”3 Although this proposal was withdrawn, it was the precursor to multiple government efforts to find a way for law enforcement to reliably gain access to encrypted communications. The most notable early attempt was the failed introduction of the “Clipper Chip,” a then state-of-the-art microchip that could be inserted into consumer hardware telephones.4 A significant vulnerability of the Clipper Chip was its reliance on “key escrow,” under which third parties held the keys to the chip’s encrypted information.5 Ultimately the proposal collapsed when prominent computer scientist Matt Blaze demonstrated that the chip was easy to exploit, would not provide the security its proponents promised, and would actually undermine the security of any device it was installed in.6
The Clipper Chip debate, along with others of that era, is often described as part of the “Crypto Wars.” While that fight played out domestically, a simultaneous debate was underway at the international level over U.S. export controls on encryption technology. Because cryptographic tools had historically been used almost exclusively by military and intelligence agencies, existing export control regimes classified them as “munitions” and restricted their export. These controls were based on the strength of the encryption and applied not only to hardware but also to encryption software and source code. The intelligence community was concerned that U.S. companies were selling strong encryption technology to international customers, which many officials feared could reduce their ability to gather information on foreign targets. However, restrictions on the quality of encryption produced by big companies would all but ensure that surveillance targets would simply use other services, leaving everyone without secure options. This aspect of the debate continues even now, with Chopra noting during his opening remarks at the event that if the government restricted encryption, “criminals and crooks are going to use encrypted communication, and the rest of us could be left vulnerable.” Ultimately, after much pushback from advocates, companies, and allies in Congress, these controls were liberalized at the end of the 1990s, effectively ending the first Crypto Wars with a blow against backdoors and a major win for consumer privacy.
Although those 1990s debates focused primarily on encrypted technology that was theoretically available to consumers, the reality is that encryption then played a dramatically smaller role in the lives of ordinary people than it does now. The use of email was increasing, but internet-connected devices, smartphones, online financial transactions, and a complete reliance on virtual communication via email or messaging tools were not even on the horizon. A few decades later, many millions of people rely on the security and privacy provided by strong encryption every single day. The rise of a second era of encryption debate was likely inevitable.
The encryption of data “in transit” protects information that is moving over a network from one place to another. End-to-end encryption, the most secure method of protecting this data, ensures that messages encrypted by the original sender can only be decrypted by the recipient—not by other actors who may intercept it or even by the company providing the messaging service.7 Encryption of data “at rest” protects information that is stored somewhere, like on a laptop, smartphone, external hard drive, or in the cloud, and is not moving from one place to another. Both modes of encryption are crucial to protecting consumer privacy, and both have been targeted for government interference.
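The end-to-end property described above, that only the intended recipient can decrypt, rests on public-key encryption: a message encrypted under the recipient's public key can be reversed only with the matching private key, which never leaves the recipient's device. A textbook RSA sketch with toy parameters makes the asymmetry concrete (these numbers are a classroom illustration, not a secure construction):

```python
# Textbook RSA with toy parameters (illustration only: real deployments
# use 2048+ bit keys plus padding schemes such as OAEP).

p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, shared with everyone
d = pow(e, -1, phi)        # 2753, the private exponent, kept secret

message = 65               # plaintext encoded as a number smaller than n

# Anyone holding the public key (n, e) can encrypt...
ciphertext = pow(message, e, n)

# ...but only the holder of the private exponent d can decrypt.
recovered = pow(ciphertext, d, n)
assert recovered == message
```

In an end-to-end encrypted messaging service, the provider relays `ciphertext` but never holds `d`, which is why it cannot read the message even under compulsion; by contrast, encryption of data at rest is typically symmetric, with the key derived on the device itself.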
In 2014, both Apple and Google announced that they would begin encrypting their smartphones by default, ensuring that users’ data would be protected at rest without any further steps to activate the feature.8 This came in the wake of the Edward Snowden revelations and increasing concern among customers that large tech companies were sharing customer data with the National Security Agency. These changes meant that all the important data on a phone, including photos, messages, contacts, reminders, and call history, was inaccessible not only to third parties but also to the companies that made the phones. Companies would no longer be able to provide information stored on devices to law enforcement upon request. This was a huge victory for consumer privacy and security, but it sparked instant backlash from the U.S. government, with the FBI accusing Apple of “[marketing] something expressly to allow people to place themselves beyond the law.”9
Given the increasing amount of personal data stored on smartphones (including health and financial data), privacy and security experts agree that stronger security for these devices is crucial to protecting privacy. The argument by some government officials that people who want secure devices are somehow nefarious, and that consumers are interested in encrypted phones only to evade law enforcement, does not reflect the reality of current threats to consumer privacy. If a mobile device stores emails, contact information, photos, location data, audio recordings, and apps that collect sensitive data, theft or loss of that device could have serious consequences for its owner. Conversely, if device owners could trust that such information would remain private should their phone be lost, stolen, or seized, they may prefer to purchase devices with those security features.
Law enforcement opposition to the increasing implementation of strong encryption has also targeted end-to-end encrypted messaging services, similarly casting them as tools used by criminal actors rather than by ordinary consumers who wish to protect their private communications. Proponents of law enforcement access to encrypted information have argued that it is possible to build a system that allows only “good guys” access to private communications, an assertion technical experts have repeatedly debunked. During the panel, Hannah Quay-de la Vallee, senior technologist at the Center for Democracy & Technology, highlighted a central flaw in backdoor proposals, namely the claim that mechanisms can be created that give only the “good guys” access to encrypted data, saying, “[the] fundamental aspect of this is not technically feasible… There's not actually a way to get 95 percent crypto, or 98 percent crypto. It's a little bit of an all-or-nothing situation.”
Specific events have regularly brought encryption into the public eye, including the December 2015 terrorist attack in San Bernardino, California, after which the FBI demanded that Apple break the encryption on the shooter’s iPhone.10 More recently, U.S. government officials have cited the role of encrypted communications in limiting Facebook’s ability to report all tips about the online sexual exploitation of children in support of their demands for encryption backdoors.11 To this point, however, no U.S. government attempt to legislate backdoors or restrict access to encryption has been successful. The role of government policy in both promoting and undermining encryption, and where these consumer privacy discussions stand now, will be discussed later in this paper.
Citations
1. Whitfield Diffie and Martin E. Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory, Vol. 22, No. 6, November 1976, available at source
2. Peter Wayner, “A Patent Falls, and the Internet Dances,” New York Times, September 6, 1997, source doc/security/pkhistory.html
3. Section 2201 of S. 266, the Comprehensive Counter-Terrorism Act of 1991
4. A. Michael Froomkin, “It Came From Planet Clipper: The Battle Over Cryptographic Key Escrow,” University of Chicago Legal Forum (1996), available at source
5. Andi Wilson Thompson, Danielle Kehl, and Kevin Bankston, “Doomed to Repeat History: Lessons from the Crypto Wars of the 1990s,” New America’s Open Technology Institute, June 2015, 5, available at source
6. Matt Blaze, “Protocol Failure in the Escrowed Encryption Standard,” AT&T Bell Laboratories, 1994, available at source
7. “A Deep Dive on End-to-End Encryption: How Do Public Key Encryption Systems Work?” Electronic Frontier Foundation, Surveillance Self-Defense, available at source
8. Cyrus Farivar, “Apple Expands Data Encryption Under iOS 8, Making Handover to Cops Moot,” Ars Technica, July 17, 2014, source; Craig Timberg, “Newest Androids Will Join iPhones in Offering Default Encryption, Blocking Police,” Washington Post, September 18, 2014, source
9. Craig Timberg and Greg Miller, “FBI Blasts Apple, Google For Locking Police Out of Phones,” Washington Post, September 25, 2014, source
10. Leander Kahney, “The FBI Wanted a Back Door to the iPhone. Tim Cook Said No,” Wired, April 16, 2019, source
11. “Attorney General Barr Signs Letter to Facebook From US, UK, and Australian Leaders Regarding Use of End-To-End Encryption,” United States Department of Justice, October 3, 2019, available at source