“Unbounded and Unpredictable”
A National Academies of Sciences Report Lays Out The Risks of a Crypto Backdoor Mandate
“Unbounded and unpredictable.” That is how Matt Blaze, professor at the University of Pennsylvania, described the risks associated with mandating an encryption backdoor at a recent National Academies of Sciences (NAS) workshop, titled “Encryption and Mechanisms for Authorized Government Access to Plaintext.” The workshop featured a series of discussion sessions ranging from the policy trade-offs of mandating backdoors, to the economic and market impacts of such a mandate, to the technical feasibility and consequences – and the ever-nagging question: is a solution plausible, even if it is technically “possible”? This two-day exercise was attended by current and former government officials, technologists and cryptographers, and civil society experts, including OTI’s director Kevin Bankston. A report-out of their discussions is now available.
The report draws no conclusions from the convening; it merely summarizes what was debated. However, the warnings issued by many of the participants were clear: encryption backdoor mandates would be dangerous to the economy, to civil liberties and human rights, and to cybersecurity. The following is a summary of some of the major issues discussed at the workshop.
Law Enforcement’s Desire for a Backdoor
Chris Inglis, the former deputy director of the NSA, began the workshop with some introductory remarks, as summarized by the report, that concluded with the idea that “there has never been a perfect encryption system; even when the math is correct, implementation is never perfect, and as a result, encryption will never meet all expectations.” Despite the fact that he believes there will always be an opportunity to exploit a vulnerability and bypass or hack encryption, he urged that the workshop seek common ground for a new system that provides sufficient security, while also meeting the needs of law enforcement to establish a means of “exceptional access” – what we at OTI would call a backdoor.
James Baker, the FBI’s general counsel, followed up on Inglis’ comments, emphasizing the problem law enforcement faces as more of the communications it seeks to obtain as evidence for investigations are protected by increasingly available encryption. He noted that the FBI is not seeking a specific technical solution and that the American people must decide what tools the FBI should have, but stressed that “alternative strategies [i.e. not backdoors] may slow down investigations, lead to larger resource requirements, or yield less complete information than would otherwise have been obtained.”
Inglis argued, and Baker agreed, that this debate about whether law enforcement should be guaranteed access to encrypted communications with legally mandated backdoors is “important enough to make it worth going beyond what seems practical and exploring what is possible.” However, the remainder of the workshop demonstrated that most participants, after carefully considering all alternatives that were put forth, believe that secure encryption backdoors are neither practical nor possible.
Human Rights Implications of Backdoors
Patrick Ball, the director of research at the Human Rights Data Analysis Group, kicked off the first session of the workshop with a discussion of how encryption backdoors would threaten human rights. Ball stressed that a government-mandated encryption backdoor would “increase the amount of government surveillance without making any of us any safer, and that the downsides of this increased surveillance would be experienced most acutely by vulnerable populations.” He warned that such a mandate would not only directly harm Americans, but would also harm U.S. policy around the world by undermining the Department of State’s funding of encryption and anonymization tools for use by human rights groups. It would also threaten the security and activities of journalists and activists around the world, particularly those in repressive regimes such as Syria, Russia, Iran, China, and Venezuela, and throughout East Africa and parts of the Middle East.
Ineffectiveness of a Policy Solution
In addition to human rights concerns, Ball argued that any U.S. mandate for encryption backdoors would be ineffective in accomplishing law enforcement’s goals. Many of the most widely used encryption products and services are either open source or developed abroad. This means that no matter what U.S. policy is, anyone who wants strong encryption, including sophisticated criminals and terrorists, will be able to access it. Indeed, as OTI’s Bankston pointed out at the workshop, most of the encrypted messaging apps recommended by ISIS are either foreign-developed, open-source, or both.
Inglis acknowledged this difficulty, and conceded that while there may be no 100% solution, a fix that would get law enforcement most of the way there might still be possible. But he recognized that even a partial fix would require a large-scale international agreement on how to implement such a backdoor requirement. Orin Kerr, professor at the George Washington University Law School, was dubious that such an agreement would ever be politically feasible, and several technologists cautioned that secure implementation of such an agreement would be exceedingly difficult and likely impossible.
After discussing policy-based arguments for and against encryption backdoors, the workshop turned to the question of implementation: is a secure system of exceptional access even feasible? The answer of many of the workshop attendees was no.
Encryption for Me but Not for Thee
First, the attendees questioned whether it would be possible to segment encryption policy “vertically,” by only allowing un-backdoored encryption for certain types of users or types of data. This idea was shot down by Eric Rescorla from Mozilla, who questioned whether it is feasible to “‘wall off’ the use of strong encryption in specific sectors.” As an example, he offered that people use the same web browser to contact medical and financial services providers, which we would want to be protected with stronger encryption, as they use when connecting to social networking platforms, which law enforcement would argue should be protected by weaker encryption.
Marc Donner, formerly of Google Health, also noted that such a system could require institutions to build their own encryption tools, which is highly unlikely to happen; currently, “encryption used in various sectors [is] based on standard, commercially available products.” Several technologists raised the concern that imposing a backdoor requirement would severely degrade the security companies choose to offer. It is unlikely that companies would devote the significant additional resources necessary to develop, maintain, and defend against attacks on inherently weak encryption. Instead, they would more likely forgo using encryption to protect their products and their users altogether, lowering security for everyone.
Several participants, such as Rescorla, found the idea of segmenting strong vs. weak encryption by user “extremely unpleasant.” Participants also found that kind of segmentation dangerous, since a wide range of people may be targets for nation-state and criminal hackers and would need the protection of strong encryption, not just people who handle sensitive financial or health data.
Susan Landau, professor at Worcester Polytechnic Institute and one of the organizers of the meeting, argued that segmenting by user is impossible, and that any attempt to do so could have negative effects for national security, since the Department of Defense is dependent on commercial products. Others noted that additional, non-governmental high-value targets include, but are not limited to, CEOs and anyone doing business internationally, further undermining the idea of preventing those parties from having strong encryption.
Building upon that concern, Bankston noted that “often the point of the attack is not to access an individual’s information but to use the compromise of an individual’s system as a platform from which to attack the valuable information of others.” This makes segmenting by user impractical. Since high-value targets are in all sectors, and each sector is only as secure as its weakest link, establishing a backdoor anywhere in the system creates an attack vector that could impact mailroom staff, CEOs, and government officials alike.
Landau then moved the group to assess whether it would be possible to segment encryption policy “horizontally” by technological layer, e.g. by requiring backdoors at the operating system or platform layer but not at the application layer, or vice versa.
Again, the conclusion was that this type of segmentation would be infeasible. ACLU Senior Staff Technologist Daniel Kahn Gillmor cautioned that if the U.S. mandates that all operating systems include encryption backdoors, someone will develop a more secure operating system abroad. Other security experts argued that platform-based backdoors would likely still be ineffective because applications could be developed in such a way as to avoid the backdoor, such as by requiring a key to access the application that the user remembers, rather than stores on their device. Finally, there were concerns that such a system would require software engineers to undermine the integrity of programs by adding significant complexity, and thus, unintentional vulnerabilities.
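To see why a key the user remembers defeats a platform-level backdoor: an application can derive its encryption key from a passphrase at runtime, so no long-lived key material sits on the device for the platform to expose. Here is a minimal sketch using Python's standard-library PBKDF2; the function name and iteration count are illustrative assumptions, not a vetted design.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a remembered passphrase into a 32-byte key.
    # The key exists only while the passphrase is being used; only the salt,
    # which is not secret, needs to be stored on the device.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, 600_000)

salt = os.urandom(16)          # stored alongside the ciphertext
key = derive_key("correct horse battery staple", salt)
assert key == derive_key("correct horse battery staple", salt)  # deterministic
assert len(key) == 32
```

An OS-level backdoor that dumps stored keys gains nothing here, because the key is recomputed from the user's head each time; the only remaining attack is guessing the passphrase itself, which the slow iteration count is designed to make expensive.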
In sum, most of the experts agreed that requiring backdoors at a lower layer, like the hardware or operating system layer, would not accomplish law enforcement’s goal, because targets could simply implement encryption at a higher layer, and it would also have the side effect of making computing products inherently less secure in unpredictable ways. Many attendees therefore concluded that horizontal segmentation of encryption policy is not a reasonable solution.
New Approaches to an Old Idea: Key Escrow
Since vertical and horizontal segmentation would not provide reasonable paths forward, the workshop attendees turned to the question of whether a complex key escrow system, what engineers would call a “k out of n” escrow system, could be workable. This proposal would establish two layers of encryption. The first layer would protect user data, and would include a backdoor – an “encryption key.” The second layer would consist of a set of “sealing keys” protecting that encryption key, with the corresponding “unsealing keys” held by multiple individuals around the world. The operating concept behind this system is that by increasing the number of people who hold keys, the chances of the unsealing keys and the encryption key being stolen would be reduced. Additionally, the chances of abusive uses of the encryption key would be reduced, since any one individual seeking to decrypt information would require the consent of the other individuals holding unsealing keys.
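The “k out of n” idea resembles threshold secret sharing, of which Shamir's scheme is the textbook example: a secret is split into n shares, and any k of them can reconstruct it, while k-1 reveal nothing. The following is a toy sketch only, not the scheme the participants discussed; the function names and the choice of field prime are illustrative assumptions.

```python
import random

# A prime field large enough to hold a 128-bit key encoded as an integer.
PRIME = 2**127 - 1

def _eval_poly(coeffs, x):
    # Evaluate the polynomial at x using Horner's rule, mod PRIME.
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % PRIME
    return acc

def split_secret(secret, n, k):
    """Split `secret` (an int < PRIME) into n shares; any k of them recover it."""
    # A random degree-(k-1) polynomial whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, _eval_poly(coeffs, x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 reconstructs the secret from k shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

The math is the easy part. As the participants' objections below make clear, the hard problems are everything around it: verifying the implementation, auditing every access, and protecting each share holder from becoming a target.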
Matthew Green, a cryptography professor at Johns Hopkins University, cautioned that it is unlikely that the security of this type of system could ever be verified, and it would be impossible to know when or if the keys had been compromised. Moreover, any system this complex would almost certainly have vulnerabilities similar to those seen in previously proposed key escrow schemes. Other security experts pointed out that, as a system put in place for law enforcement needs, it would need to be regularly accessed by countless people, including compliance personnel from the relevant companies and the many federal, state, and local law enforcement officers seeking to get evidence decrypted. Any system that has to accommodate regular access by such a large number of parties while also maintaining the confidentiality of all of the keys would necessarily be very complex and much harder to secure. Finally, participants emphasized that there would still be a significant risk that nation-state or criminal actors could steal the “unsealing keys,” as any holder would become a prime target for hackers.
After completing the discussion around a “k out of n” key escrow concept, the workshop turned to the question of whether a key escrow system could be made more secure by storing the encryption key on the device itself. The merit of this approach is that it would reduce security risks by requiring that any party seeking to exploit the backdoor have possession of the device in question, in addition to obtaining the unsealing keys from the escrow authority. Bankston agreed that this approach could reduce some of the security risk, but remained concerned that it would not address the human rights threats that backdoors raise, citing as an example the possibility that state actors like China, the same actors most likely to compromise a key escrow facility, also have the capability of seizing very large numbers of devices.
Landau also argued that inserting backdoors in device hardware would be a dangerous approach. She argued that mobile devices, and smartphones in particular, are increasingly becoming authenticators – tools that we use to prove we are who we say we are when accessing highly sensitive information, such as when signing into banking, medical, or email accounts, or when serving as a second factor of authentication. This means that if a hacker steals the device and accesses the encryption key stored on it, they could not only break into the device itself, but could also use that device to gain access to sensitive information about all aspects of a person’s life or business.
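Landau's point is concrete in, for example, time-based one-time passwords (RFC 6238), a common form of phone-based second factor: possession of a small secret stored on the device is the entire proof of identity. A compact Python sketch of the standard algorithm:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, t=None, step=30, digits=6) -> str:
    # RFC 6238: HMAC the current 30-second counter with the device-held
    # secret, then dynamically truncate to a short numeric code.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10**digits:0{digits}d}"

# RFC 6238 test vector: at t=59 seconds, the SHA-1 code for this secret
# truncated to six digits is 287082.
print(totp(b"12345678901234567890", t=59))  # → "287082"
```

Anyone who extracts that stored secret, whether by seizing the phone or through a hardware backdoor, can mint valid codes indefinitely, which is exactly why backdooring the authenticator device compromises every account it protects.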
The Costs of an Encryption Backdoor
The assessment of the technical feasibility of various approaches for encryption backdoors led to a discussion of the costs of implementing any of those proposed solutions. Blaze issued a stark warning. He told the attendees in no uncertain terms that many of the problems we face with our security architecture today stem from the crypto wars of the 1990s, which distracted researchers and engineers from building stronger systems and prohibited widespread use of certain strong cryptographic algorithms because they were classified as a munition and subject to severe trade restrictions until 2000.
Our world is exponentially more connected today than it was in the ’90s, and will become even more so with the rise of the Internet of Things. Blaze and other technologists were adamant that the security costs of imposing a backdoor requirement now would be enormous, not only because of the inherent risks that such a system could be exploited, but also because of the opportunity cost associated with devoting resources to developing and maintaining a system that intentionally weakens security rather than strengthens it.
Bankston also raised serious concerns, which were echoed by several other participants, about the negative economic impacts of such a mandate. Consumers are increasingly expecting and demanding strong encryption as a means of securing themselves against growing cyber threats. Additionally, while large companies might be able to bear the burden of a backdoor mandate – though as the technical discussion sessions made clear, it would be an extremely heavy burden – smaller companies would either fail due to the cost of implementation, or would choose to forgo providing security through encryption altogether. This would inevitably result in a loss of market competitiveness and put their users at increased risk.
Bankston put it best when he “suggested that the United States can either invest hundreds of millions of dollars to update law enforcement’s investigative capabilities for the 21st century or the economy can face a loss of billions of dollars if exceptional access is mandated for U.S. products.”
If Backdoors Aren’t Possible or Practical, What is to be Done?
Bankston’s comments raise the question of what exactly “21st century capabilities” entail. There was broad agreement that the FBI must be provided with adequate staff and resources to meet its technological needs. Right now, the FBI has only 39 staff who carry out activities to respond to the investigative challenges posed by encryption and anonymization technologies, only 11 of whom are agents, and $31 million in funding for those activities (which may increase to $38 million, but with no increase in staff).
Additionally, FBI staff need better technical training so that avoidable mistakes do not happen. Lastly, several of the attendees argued that those capabilities should include law enforcement hacking for investigative purposes. Government hacking into devices and networks raised concerns and questions among some attendees, including the question of what requirements the FBI would be under to disclose the vulnerabilities it exploits, and the concern that it may be infeasible for state and local law enforcement to acquire the expertise and tools necessary to conduct such so-called “lawful hacking.” Bankston also made clear that any such approach would necessitate “a substantial policy and legal debate…to define responsible, reasonable, and constitutional government hacking.”
This workshop reinforced that there are no easy answers to the dilemma that the FBI and state and local law enforcement claim they face. The one answer that comes across clearly in the report is the same one we at OTI reached over two years ago: encryption backdoors were a bad idea in the 1990s, and they are an even worse idea today. Instead of continuing down this road to nowhere, the time has come (and gone and come again) to stop talking about whether a secure backdoor can exist, and start talking about what is both possible and practical: how to ensure that law enforcement at all levels of government can evolve to meet the demands of a fast-changing world, where new technologies bring endless new opportunities, and at times, some new obstacles.