In recent months, both Deputy Attorney General Rod Rosenstein and FBI Director Christopher Wray have been calling for holes in encryption law enforcement can drive a warrant through. Neither has any idea how this can be accomplished, but both are reasonably sure tech companies can figure it out for them. And if some sort of key escrow makes encryption less secure than it is now, so be it. Whatever minimal gains in access law enforcement obtains will apparently offset the damage done by key leaks or criminal exploitation of a deliberately weakened system.
Cryptography expert Riana Pfefferkorn has released a white paper [PDF] examining the feasibility of the vague requests made by Rosenstein and Wray. Their preferred term is "responsible encryption" -- a term that allows them to step around landmines like "encryption backdoors" or "we're making encryption worse for everyone!" Her paper shows "responsible encryption" is anything but. And, even if implemented, it will result in far less access (and far more nefarious exploitation) than Rosenstein and Wray think.
The first thing the paper does is try to pin down exactly what it is these two officials want -- easier said than done because neither official has the technical chops to precisely describe his preferred solution. Nor do they have any technical experts on board to help guide them to their envisioned solution. (The latter is easily explained by the fact that no expert on cryptography has ever promoted the idea that encryption can remain secure after drilling holes in it at the request of law enforcement.)
If you're going to respond to a terrible idea like "responsible encryption," you have to start somewhere. Pfefferkorn starts with an attempt to wrangle vague law enforcement official statements into a usable framework for a reality-based argument.
Rosenstein’s remarks focused more on data at rest than data in transit. For devices, he has not said whether his preferred legislation would cover a range of devices (such as laptop and desktop computers or Internet of Things-enabled appliances), or only smartphones, as in some recent state-level bills. His speeches also leave open whether his preferred legislation would include an exceptional-access mandate for data in transit. As some commentators have pointed out, his proposal is most coherent if read to be limited in scope to mobile device encryption and to exclude data in transit. This paper therefore makes the same assumption.
Wray, meanwhile, discussed both encrypted messaging and encrypted devices in his January 2018 speech. He mentioned “design[ing] devices that both provide data security and permit lawful access” and asked for “the ability to access the device once we’ve obtained a warrant.” Like Rosenstein, he did not specify whether his “responsible solution” would go beyond mobile devices. As to data in transit, he used a financial-sector messaging platform as a real-world example of what a “responsible solution” might look like. Similarly, though, he did not specify whether his “solution” would be restricted to only certain categories of data—for example, communications exchanged through messaging apps (e.g., iMessage, Signal, WhatsApp) but not web traffic (i.e., HTTPS). This paper assumes that Wray’s “solution” would, like Rosenstein’s, encompass encryption of mobile devices, and that it would also cover messaging apps, but not other forms of data in transit.
Either way, there's no one-size-fits-all approach. This is somewhat ironic given these officials' resistance to using other methods, like cellphone-cracking tools or approaching third parties for data and communications. According to the FBI (in particular), those solutions "don't scale." Well, neither does either of the approaches suggested by Rosenstein and Wray, although Rosenstein's decision to limit his arguments to data at rest on devices does suggest a somewhat more scalable approach.
The only concrete example given of how key escrow might work to access end-to-end encrypted communications is the one noted above: a messaging platform used for bank communications. An agreement reached with the New York state government altered the operation of the banking industry's "Symphony" messaging platform. Banks now hold encrypted communications for seven years, but duplicate decryption keys are generated and held by independent parties (neither the banks nor the government). But this analogy doesn't apply as well as FBI Director Christopher Wray thinks it does.
That agreement was with the banks about changing their use of the platform, not with the developer about changing its design of the platform, which makes it a somewhat inapt example for illustrating how developers should behave “responsibly” when it comes to encryption.
Applied directly, the analogy would be akin to asking cellphone owners to store a copy of their decryption key with an independent party in case law enforcement needed access to the contents of their phone. If multiple communication platform providers are involved, owners would have to generate several duplicate keys. What the analogy does not support is what Wray and Rosenstein actually propose: the duplication or creation of decryption keys by manufacturers solely for the purpose of government access.
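The custodial arrangement described above, duplicate keys held by parties other than the banks or the government, can be illustrated with a toy secret-splitting scheme. This is a hedged sketch under assumptions, not Symphony's actual design: it splits a key into two XOR shares so that neither custodian alone learns anything about the key, and both must cooperate to reconstruct it.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares via XOR. Both shares are required to
    reconstruct the key; either share alone is indistinguishable from
    random bytes and reveals nothing."""
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(x ^ y for x, y in zip(share_a, key))
    return share_a, share_b

def reconstruct_key(share_a: bytes, share_b: bytes) -> bytes:
    """Combine both custodians' shares to recover the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# A duplicate of the message-decryption key, split between two
# independent custodians (hypothetical parties for illustration).
message_key = secrets.token_bytes(32)
custodian_a_share, custodian_b_share = split_key(message_key)

assert reconstruct_key(custodian_a_share, custodian_b_share) == message_key
```

Note what even this toy version makes plain: someone has to generate the duplicate, distribute the shares, and stand ready to recombine them on demand, and every one of those steps is a process that must be secured.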
These officials think this solution scales. And it does. But scaling increases the possibility of the keys falling into the wrong hands, not to mention the increased abuse of law enforcement request portals by criminals to gain access to locked devices and accounts. As Pfefferkorn notes, these are problems Wray and Rosenstein have never addressed. Worse, they've never even admitted these problems exist.
What a quasi-escrow system would do is vastly multiply the attack vectors available to criminals and state-sponsored hackers. Implementing Rosenstein's suggestion would provide ample opportunities for misuse.
Rosenstein suggests that manufacturers could manage the exceptional-access decryption key the same way they manage the key used to sign software updates. However, that analogy does not hold up. The software update key is used relatively infrequently, by a small number of trusted individuals. Law enforcement’s unlocking demands would be far more frequent. The FBI alone supposedly has been unable to unlock around 7,800 encrypted devices in the space of the last fiscal year. State and local law enforcement agencies, plus those in other countries, up the tally further. There are thousands of local police departments in the United States, the largest of which already amass hundreds of locked smartphones in a year.
Wray's suggestion isn't any better. In fact, it's worse. His proposal (what there is of it) suggests it won't just be phone manufacturers providing key escrow but also any developer offering end-to-end encrypted communications. This vastly increases the number of key sources. In both cases, developers and manufacturers would need to take on more staff to handle law enforcement requests. This increases the number of people with access to keys, increasing the chances they'll be leaked, misused, or even sold.
The large number of law enforcement requests headed to key holders poses more problems. Bogus requests are going to start making their way into the request stream, potentially handing access to criminals or other bad actors. While this can be mitigated with hardware storage, the attack vectors remain open.
[A]n attacker could still subvert the controls around the key in order to submit encrypted data to the HSM [hardware security module] for decryption. This is tantamount to having possession of the key itself, without any need to attack the tamper-resistant HSM directly. One way for an attacker to get an HSM to apply the key to its encrypted data input is to make the attacker’s request appear legitimate by subverting the authentication process for exceptional-access demands.
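The attack the paper describes, subverting the authentication around exceptional-access requests rather than the tamper-resistant hardware itself, can be sketched in a few lines. This is a hedged toy model with hypothetical names: the "HSM" uses XOR in place of real decryption, and requests are authenticated with an HMAC secret held by a request portal. The escrow key never leaves the module, yet anyone who steals the portal's authentication secret can make the module decrypt arbitrary ciphertext, which is tantamount to holding the key.

```python
import hashlib
import hmac
import secrets

ESCROW_KEY = secrets.token_bytes(32)     # sealed inside the "HSM"; never exported
PORTAL_SECRET = secrets.token_bytes(32)  # authenticates exceptional-access requests

def _apply_key(data: bytes) -> bytes:
    """Toy cipher: XOR with the escrow key (stands in for real decryption)."""
    return bytes(d ^ k for d, k in zip(data, ESCROW_KEY))

def hsm_decrypt(request: bytes, tag: bytes, ciphertext: bytes) -> bytes:
    """The HSM keeps ESCROW_KEY tamper-resistant, but it will apply that key
    to ANY ciphertext accompanying a request whose HMAC tag checks out."""
    expected = hmac.new(PORTAL_SECRET, request, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise PermissionError("unauthenticated exceptional-access request")
    return _apply_key(ciphertext)

# An attacker who compromises PORTAL_SECRET never attacks the HSM directly,
# yet can decrypt at will by forging a "legitimate" request.
forged_request = b"warrant #0000, device serial XYZ"
forged_tag = hmac.new(PORTAL_SECRET, forged_request, hashlib.sha256).digest()

plaintext = b"a private message, 32 bytes long"
intercepted = _apply_key(plaintext)  # ciphertext under the escrow key
assert hsm_decrypt(forged_request, forged_tag, intercepted) == plaintext
```

The hardware did its job perfectly here; the break is in the surrounding authorization process, which is exactly the layer that thousands of law enforcement requests per year would exercise.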
These are just the problems a key escrow system would produce on the supply side. The demand for robust encryption won't go away. Criminals and non-criminals alike will seek out truly secure platforms and products, taking their business to vendors out of the US government's reach. At best, forced escrow will be a short-term solution with a whole bunch of collateral damage attached. Domestic businesses will lose sales and other businesses will be harmed as deliberately introduced holes in encryption allow attackers to exfiltrate intellectual property and trade secrets, conduct industrial espionage, and engage in identity theft.
Wray and Rosenstein tout "responsible encryption." But their arguments are completely irresponsible. Neither has fully acknowledged how much collateral damage would result from their demands. They've both suggested the damage is acceptable even if there is only a minimal gain in law enforcement access. And they've both made it clear every negative consequence will be borne by device and service providers -- from the additional costs of compliance to the sales lost to competitors still offering uncompromised encryption. There's nothing "responsible" about their actions or their public statements, but they both believe they're 100% on the right side of the argument. They aren't, and they've made it clear the wants and needs of US citizens will always be secondary to the wants and needs of law enforcement.