How the DOJ’s call for Apple to unlock the Pensacola shooter’s iPhone is different from the San Bernardino case
Last week we had our team offsite retreat to plan for the coming year. One thing that came out of it was a decision to start considering whether to expand the scope of our policy work to include technologies adjacent to cryptocurrency.
A good example is the set of policy questions around encryption and encrypted messaging. Because cryptocurrency depends on encryption, those questions obviously bear directly on it. But it’s more than that. If law enforcement gets what it wants, namely the legal authority to mandate backdoors into encrypted systems, that would easily include not just backdoors into private cryptocurrencies, but perhaps also the ability to stop or reverse transactions.
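To make concrete what a legally mandated backdoor could look like at the technical level, here’s a minimal sketch of one classic design, key escrow, in which every message is encrypted a second time to a government-held key. This is purely illustrative and not drawn from any actual proposal; the key names and the choice of RSA-OAEP are my own assumptions:

```python
# Toy key-escrow scheme: illustrative only, not any real proposal.
# Every message is encrypted once to the recipient and once to a
# government-held "escrow" key, so a court order can decrypt it
# without the recipient's cooperation.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

message = b"meet at noon"

# The sender encrypts the same plaintext under both public keys.
to_recipient = recipient_key.public_key().encrypt(message, oaep)
to_escrow = escrow_key.public_key().encrypt(message, oaep)

# The recipient decrypts normally...
assert recipient_key.decrypt(to_recipient, oaep) == message
# ...but so can whoever holds the escrow key, consent or no.
assert escrow_key.decrypt(to_escrow, oaep) == message
```

In a cryptocurrency context, the analogous mandate would be a protocol change that hands some party a master capability over otherwise private transactions, which is exactly why the encryption fight is our fight too.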
It seems that we’re on the right track, unfortunately. Lo and behold yesterday the Department of Justice called on Apple to help it break into an iPhone that belonged to the Pensacola Air Station shooter. Here’s the relevant part of Attorney General Barr’s statement yesterday:
The shooter possessed two Apple iPhones, seen on posters here.
Within one day of the shooting, the FBI sought and received court authorization based on probable cause to search both phones in an effort to run down all leads and figure out with whom the shooter was communicating. …
However, both phones are engineered to make it virtually impossible to unlock them without the password. It is very important to know with whom and about what the shooter was communicating before he died.
We have asked Apple for their help in unlocking the shooter’s iPhones. So far Apple has not given us any substantive assistance. This situation perfectly illustrates why it is critical that investigators be able to get access to digital evidence once they have obtained a court order based on probable cause. We call on Apple and other technology companies to help us find a solution so that we can better protect the lives of Americans and prevent future attacks.
The media has noted the similarities between this case and the 2016 dispute between Apple and the FBI over the San Bernardino shooter’s iPhone. But what I find interesting here is the difference.
In the San Bernardino case, Apple was served with a court order directing it to write software to break its own phones:
As a result, the FBI asked Apple Inc. to create a new version of the phone’s iOS operating system that could be installed and run in the phone’s random access memory to disable certain security features, a version Apple refers to as “GovtOS”. Apple declined due to its policy of never undermining the security features of its products. The FBI responded by successfully applying to United States magistrate judge Sheri Pym for a court order mandating Apple to create and provide the requested software.
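Those “certain security features” are the escalating delays after failed passcode attempts and the optional auto-wipe after ten failures. Some back-of-the-envelope arithmetic shows why disabling them was the whole point. The roughly 80 ms per-guess key-derivation cost is the calibration figure Apple has published for its hardware, but treat the exact numbers here as assumptions:

```python
# Why "GovtOS" mattered: with retry limits and auto-wipe disabled,
# a 4-digit passcode falls to brute force almost immediately.
ATTEMPT_COST_SECONDS = 0.08  # approx. per-guess key-derivation time
                             # on the device's hardware (assumption)
combinations = 10 ** 4       # 4-digit numeric passcode

worst_case = combinations * ATTEMPT_COST_SECONDS
print(f"worst case, no retry limits: {worst_case / 60:.1f} minutes")
# -> worst case, no retry limits: 13.3 minutes
```

With the limits in place, by contrast, ten wrong guesses can erase the phone’s keys entirely, which is why the FBI needed Apple to ship modified software rather than simply guessing.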
Apple defied the court order and refused to write the software. Eventually the FBI backed down, and the conventional narrative is that it did so because it found a security firm that helped it circumvent the phone’s security. This is how the New York Times characterized it just yesterday:
The San Bernardino dispute was resolved when the F.B.I. found a private company to bypass the iPhone’s encryption. Tensions between the two sides, however, remained, and Apple worked to ensure that neither the government nor private contractors could open its phones.
I think there was more to the FBI/DOJ’s decision than that. Shortly before the government backed down, Apple made it clear that if the case went to court, it would mount a First Amendment defense. As WIRED reported at the time:
A famous encryption case known as Bernstein v. US Department of Justice established long ago that code is speech and is protected by the First Amendment. Compelling Apple to write code would be the equivalent of the government compelling Apple’s speech. But that’s not the most important argument in this case. Instead, it’s the digital signature that Apple would use to sign that code that is the key to Apple’s First Amendment argument, say legal experts who spoke with WIRED.
“The human equivalent of the company signing code is basically saying, ‘We believe that this code is safe for you to run,’” says Jennifer Granick, director of civil liberties for the Center for Internet and Society at Stanford Law School. “So I think that when you force Apple to cryptographically sign the software, it has a communicative aspect to it that I think is compelled speech to force them to do it.”
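It’s worth pausing on what “cryptographically sign the software” means, because the signature is the crux of the argument. Here’s a minimal sketch using a generic Ed25519 keypair; this is illustrative only and is not Apple’s actual signing scheme, whose key names and details I’m inventing for the example:

```python
# Illustrative code signing: not Apple's actual scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The vendor holds a long-lived signing key; the public half ships
# inside every device so it can verify future software.
vendor_key = Ed25519PrivateKey.generate()
device_trusted_key = vendor_key.public_key()

firmware = b"...compiled OS image bytes..."

# "Signing the code": the vendor attests that this exact binary
# came from it and, implicitly, that it is safe to run.
signature = vendor_key.sign(firmware)

# On boot the device refuses any image whose signature fails to
# verify against the embedded public key.
try:
    device_trusted_key.verify(signature, firmware)
    print("signature valid: device will run this image")
except InvalidSignature:
    print("signature invalid: device rejects this image")
```

The signature, not the code itself, is what carries the message “we vouch for this.” Compelling Apple to sign GovtOS would compel it to make that statement about software it believes is dangerous, which is the compelled-speech problem Granick describes.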
That’s a fight the DOJ doesn’t seem inclined to pick again, so for now it appears to be limiting itself to influencing public opinion rather than going to court. But it’s a fight we have to be prepared to wage in court all the same. If a developer can be forced to write code they don’t want to write (that is, compelled to speak when they would rather not), then it’s only a matter of time before cryptocurrency developers get court orders to build in backdoors. I think we probably have the upper hand at the moment, but we have to be prepared to make the case not just for privacy, but perhaps more importantly for free speech.