Law enforcement access to encrypted devices is possible, but it’s a terrible idea

It seems that this paper by renowned computer scientist Stefan Savage has recently been circulating on Capitol Hill. It proposes a system for allowing court-sanctioned access to encrypted devices by having manufacturers maintain a key escrow system. It’s similar to other proposals like Ray Ozzie’s “Clear” scheme.
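
To make the shape of these proposals concrete, here is a minimal sketch of the generic key-escrow pattern. It is my own illustration, assuming Python’s `cryptography` package and using RSA-OAEP and AES-GCM as stand-in primitives; the real proposals differ in important details.

```python
# Minimal sketch of the generic key-escrow pattern (illustrative only;
# not the specific construction in Savage's paper or Ozzie's Clear).
# Assumes the `cryptography` package: pip install cryptography
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Manufacturer generates an escrow keypair; the private half stays in
# its vault and is used only in response to a court order.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
escrow_public = escrow_private.public_key()

# Device side: a random symmetric key encrypts user data at rest...
device_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(device_key).encrypt(nonce, b"user data", None)

# ...and a copy of that key is wrapped under the escrow public key and
# stored alongside the ciphertext.
wrapped_key = escrow_public.encrypt(device_key, OAEP)

# Manufacturer side, after a court order: unwrap the key and decrypt.
recovered_key = escrow_private.decrypt(wrapped_key, OAEP)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"user data"
```

Even in this toy form, the trade-off at the heart of the debate is visible: `escrow_private` is a master secret, and its compromise, theft, or compelled use unwraps every device key ever escrowed under it.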

Forgive me for being cynical, but I suspect that the argument that accompanies circulation of such a paper is, “See, despite what tech companies and cryptographers say, it is possible to design an encryption system that allows law enforcement to decrypt a device with a court order.” The problem with that argument is that it’s a straw man. As far as I can tell, technologists don’t claim that it can’t be done, only that it can’t be done without compromising collective security.

Here is Facebook in 2018: “We have yet to hear of a technical solution to this challenge that would not risk weakening security for all users.” And here’s Apple:

Proposals that involve giving the keys to customers’ device data to anyone but the customer inject new and dangerous weaknesses into product security. Weakening security makes no sense when you consider that customers rely on our products to keep their personal information safe, run their businesses or even manage vital infrastructure like power grids and transportation systems.

“You can do it, sure, but it would create an unacceptable risk for users,” technologists say. And then I think they sometimes engage in a little straw-manning themselves by adding that the government claims there is no trade-off. I’ve read a bunch of official statements and speeches on the topic recently and I haven’t seen any that deny a trade-off outright, though they do come close. Whatever the case, Attorney General Barr addressed the issue of a trade-off head-on in his speech last year on encryption:

Some [claim] that it is technologically impossible to provide lawful access without weakening security against unlawful access. But, in the world of cybersecurity, we do not deal in absolute guarantees but in relative risks. All systems fall short of optimality and have some residual risk of vulnerability, a point which the tech community acknowledges when they propose that law enforcement can satisfy its requirements by exploiting vulnerabilities in their products. The real question is whether the residual risk of vulnerability resulting from incorporating a lawful access mechanism is materially greater than those already in the unmodified product. The Department does not believe this can be demonstrated.

Moreover, even if there was, in theory, a slight risk differential, its significance should not be judged solely by the extent to which it falls short of theoretical optimality. Particularly with respect to encryption marketed to consumers, the significance of the risk should be assessed based on its practical effect on consumer cybersecurity, as well as its relation to the net risks that offering the product poses for society. After all, we are not talking about protecting the Nation’s nuclear launch codes. Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.

That’s a pretty condescending way to talk about citizens’ intimate private information, but at least he’s acknowledging the trade-off. Key escrow can be done, sure, but doing it necessarily introduces some amount of risk to all users of a technology. If we can all agree on that proposition, then the relevant questions become:

  1. How much risk does that “some” encompass?
  2. Do the benefits of law enforcement access outweigh the costs of that risk?

The first question is mostly a technical one. Is there really just “a slight risk differential” as Barr suggests? I’m not a cryptographer, so on that score I’d recommend Matthew Green’s critique of key escrow systems like Clear and the one outlined in Savage’s paper, as well as Robert Graham’s take.

The second question is about what policy maximizes social welfare. Do benefits outweigh costs? To my mind, there is one big item on the cost side of the ledger that will be hard to overcome: if device manufacturers set up such a system to comply with lawful orders from Western liberal states, they will end up complying with orders from illiberal states as well. As Apple and other tech firms are fond of saying, they comply with the law of the jurisdictions in which they operate. That means China and Russia and certainly our friendly NATO ally Turkey.

People around the world rely on encryption for their safety and freedom. Journalists, activists, persecuted minorities, and average citizens have legitimate reasons to keep secrets from their governments. The costs one considers in evaluating a key-escrow program have to encompass the inevitable abuse by governments that rule over billions of people.

Another question is, How should we decide whether the benefits outweigh the costs? Here’s Alan Z. Rozenshtein at Lawfare:

Even more importantly, this question implicates policy tradeoffs and value judgments that neither technology companies nor the information-security community have the necessary democratic legitimacy to make on their own. It’s neither up to Apple nor the Electronic Frontier Foundation (nor, for that matter, the FBI) to unilaterally decide how much information security is worth sacrificing to save a life or stop a crime; that’s a decision for the public, acting through its elected government, to make for itself. (Hence the ultimate need for a legislative solution to settle this debate one way or another.)

I agree that this is a question that should be answered by “the public,” but I don’t see why representative democracy is the only way it can speak. Indeed, it seems the public has been speaking pretty clearly on how it feels about encryption. From former Deputy Attorney General Rod Rosenstein’s big speech on encryption:

Technology companies operate in a highly competitive environment. Even companies that really want to help must consider the consequences. Competitors will always try to attract customers by promising stronger encryption.

That explains why the government’s efforts to engage with technology giants on encryption generally do not bear fruit. Company leaders may be willing to meet, but often they respond by criticizing the government and promising stronger encryption.

Of course they do. They are in the business of selling products and making money. …

Technology companies almost certainly will not develop responsible encryption if left to their own devices. Competition will fuel a mindset that leads them to produce products that are more and more impregnable.

He means all that to be an indictment, but it’s quite the opposite. He’s explaining that the public demands strong encryption and that device manufacturers have every incentive to meet that demand.

This all betrays a gap between elites and the public. The public clearly wants unalloyed encryption, yet the elite response seems to be, “We know better and we’ll get to the right outcome with ‘democratic legitimacy’ even if we have to propose the same plan a dozen times. After all, we’re not talking about ‘the customized encryption used by large business enterprises to protect their operations,’ we’re merely talking about ‘consumer products and services.’”