Opinion
Security and Privacy Law and Technology

No Easy Answers in the Fight Over iPhone Decryption

A look at the legal background and future possibilities for an issue that is likely to reoccur.
Figure: Empty chairs reserved for Apple and the FBI in preparation for the U.S. House Committee on the Judiciary hearing in Washington, D.C., on Tuesday, March 1, 2016.

Perhaps the most significant law and technology story of 2016 thus far has been the fight between the FBI and Apple over whether Apple could be compelled to assist in decrypting the contents of an alleged terrorist’s iPhone. The San Bernardino case that captured the most attention ended without a judicial decision, because the FBI withdrew its request after a third party provided it with a means to obtain the data without Apple’s help. The underlying legal questions, though, had already been raised in other cases and will surely recur in future ones. In this column, I explain the legal issues in the case and why it should be neither impossible nor easy for the government to compel this sort of assistance.

This story began when the FBI came into possession of the iPhone used by one of the shooters in the attack at the Inland Regional Center in San Bernardino, CA, in December 2015. The FBI wanted to examine the contents of the phone, but this particular phone, an iPhone 5c running iOS 9, stores data in encrypted form using a key derived from the user’s passcode. User passcodes can be as short as four numeric digits, but Apple designed the phone to resist brute-force attacks by inserting increasingly long delays between unsuccessful login attempts and, if enabled by the user, by wiping the phone of the necessary key information after 10 unsuccessful attempts.
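To make those protections concrete, here is a minimal sketch, in Python, of the two mechanisms just described: a key derived from the passcode and escalating penalties for failed attempts. It is illustrative only, not Apple’s implementation; the delay schedule, the PBKDF2 parameters, and the fixed software salt standing in for the iPhone’s hardware-bound device key are all assumptions.

```python
import hashlib
import hmac
import time

# Illustrative sketch only -- not Apple's actual implementation. Real
# iPhones entangle the passcode with a device-unique hardware key; a
# fixed software salt stands in for that here (an assumption).
DEVICE_SALT = b"per-device-salt"
MAX_ATTEMPTS = 10  # wipe threshold, when enabled by the user
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # assumed schedule, seconds

def derive_key(passcode: str) -> bytes:
    """Derive the data-encryption key from the user's passcode."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SALT, 100_000)

def try_unlock(passcode: str, stored_key: bytes, failed_attempts: int) -> bool:
    """One unlock attempt, subject to the two brute-force impediments."""
    if failed_attempts >= MAX_ATTEMPTS:
        raise RuntimeError("key material wiped; data is unrecoverable")
    time.sleep(DELAYS[failed_attempts])  # increasingly long delay between tries
    return hmac.compare_digest(derive_key(passcode), stored_key)
```

A four-digit passcode has only 10,000 possibilities, so it is the delay and wipe logic, not the cryptography, that makes brute force impractical; the order the FBI sought targeted exactly those impediments.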

On February 16, 2016, the FBI sought and obtained a court order compelling Apple to design and sign a system update, tied to this particular phone, that would disable the impediments to a brute-force attack. Apple refused to comply, challenging the legality of the court’s order. After a round of briefing, and on the eve of a court hearing, the FBI withdrew its request on March 28, having obtained the assistance of an unknown third party in circumventing the iPhone’s security measures and acquiring the decrypted contents it sought.
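The order had to go to Apple because the phone will install only an update bearing a valid signature from Apple’s private key. The sketch below shows the general shape of such a check, here using Ed25519 from the third-party cryptography package; the function names and the way the update is bound to one device’s identifier are illustrative assumptions, not Apple’s actual signing scheme or update format.

```python
# Illustrative sketch of signed-update acceptance; uses the third-party
# "cryptography" package. Not Apple's actual signing scheme or format.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def accept_update(vendor_pubkey: Ed25519PublicKey, image: bytes,
                  signature: bytes, device_id: bytes) -> bool:
    """Install an update only if the vendor signed this exact image
    for this exact device."""
    try:
        vendor_pubkey.verify(signature, image + device_id)
        return True
    except InvalidSignature:
        return False

# Only the holder of the private key can produce an acceptable signature.
vendor_key = Ed25519PrivateKey.generate()
image, device = b"update-without-retry-limits", b"device-serial-0001"
sig = vendor_key.sign(image + device)

assert accept_update(vendor_key.public_key(), image, sig, device)           # accepted
assert not accept_update(vendor_key.public_key(), image, sig, b"other-id")  # rejected
```

Signing over the device identifier is one way the order’s “tied to this particular phone” requirement could be met: the same signed image would be rejected by every other device.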


In its brief challenging the order, Apple made two types of arguments: constitutional and statutory. That is, Apple argued both that compelling its assistance would violate the Constitution and that such compulsion was beyond the powers that Congress had given to the federal courts. The distinction is important because while a ruling that the compulsion was beyond the court’s statutory power could be changed by Congress—and there might be substantial pressure on Congress to make such a change—a ruling that the compulsion violated the Constitution would be beyond Congress’s power to change.

Apple’s primary constitutional argument was that compelling its assistance would violate the First Amendment right to freedom of expression. The argument is seductively simple, almost syllogistic. Step one: courts have previously recognized computer code as a form of speech protected by the First Amendment. Step two: the Supreme Court has long held that the government can violate the First Amendment not only by restricting someone’s speech, but also by compelling someone to speak. Ergo, forcing Apple to write code that it did not want to write would be compelling it to speak, in violation of the First Amendment. QED.

The trouble with the argument is that the syllogism breaks down because the reasons for protecting computer code have nothing to do with the reasons for forbidding compelled speech. Computer code merits at least some First Amendment protection because it can be used to communicate. If I want to describe how an encryption algorithm works, for example, I might do so using code or pseudo-code. The code conveys ideas, and the government should not be able to stop me from explaining how an encryption algorithm works, at least not without a very good reason.

Compelling speech is problematic for an entirely different reason. Compelled speech does not stop anyone from saying or explaining anything; nothing prevents a person from denying in the next breath what they were forced to say in the first one. The problem is that being forced to affirm a belief one does not hold is itself a means of manipulating belief, and it runs counter to the freedom to choose one’s own beliefs. The classic Supreme Court case striking down a speech compulsion, West Virginia State Board of Education v. Barnette, involved schoolchildren compelled to recite the Pledge of Allegiance.

The code that the government would force Apple to produce is not a set of beliefs, and producing code is not like affirming beliefs. For starters, it is not meaningful to talk about the beliefs of Apple, the company. Even if we look at actual Apple employees who would be directing or writing code, if they object, it is not because the code itself would be a misstatement of their beliefs. What they would be objecting to is being forced to do something that they do not believe in. But the government forces people to do things they do not believe in all the time: to pay taxes, to serve on a jury, even to stop at a stop sign. None of those things even come close to raising a First Amendment issue. Neither should forcing Apple to write code.

The government’s access to information is also limited by the Fourth Amendment, which prohibits “unreasonable searches and seizures,” but one way to make a search or seizure reasonable is to obtain a valid warrant, which the government had in these circumstances. Thus, there was no viable Fourth Amendment argument in the case, and Apple did not make one.

Apple’s other arguments were statutory. In particular, Apple argued that the law the FBI and the court had invoked, the All Writs Act (AWA), did not in fact authorize the order at issue. The AWA, first passed in 1789, states quite simply, “The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” That very general language leaves to the courts the task of sorting out what counts as “necessary or appropriate” and “agreeable to the usages and principles of law.”

It is well established that the AWA only applies to situations that Congress has not specifically addressed. Apple argued that Congress had addressed this issue, in its favor, in a law called the Communications Assistance for Law Enforcement Act (CALEA). CALEA has language saying that the government cannot mandate the design of telecommunications systems and cannot require telecommunications carriers to decrypt user-encrypted communications.

But as relevant to this case, Apple is not a telecommunications carrier. Nor would it want to be, because CALEA’s main effect is to mandate that telecommunications carriers be able to intercept the communications they carry. The point of CALEA was to ensure that phone companies, and later VoIP and broadband providers, could intercept communications when presented with a lawful order to do so. In return, the government would not tell the phone companies how to choose among designs with the required capabilities, and would not make the carriers responsible for user encryption. Since Apple is not subject to CALEA’s mandate, there is no reason to think it can take advantage of CALEA’s limits either.

This means we are in a situation that Congress has not explicitly addressed. The extent to which the AWA can be used in such situations is quite uncertain, because there have not been many such cases over the years. In fact, there is essentially one relevant Supreme Court precedent: United States v. New York Telephone Co., decided in 1977. There, the Court held that the New York Telephone Company could be compelled under the AWA to install what is known as a “pen register” on its own equipment in order to record the phone numbers dialed from a particular phone. In so doing, the Court discussed a number of factors that supported its conclusion, but it was not clear about whether any of the factors was strictly necessary or about how, if at all, the factors were supposed to be weighed against one another.


Later cases have derived three factors from the language of the New York Telephone case. The first is whether the company is “a third party so far removed from the underlying controversy that its assistance could not be permissibly compelled.” Apple argued that it was “far removed” from the controversy, because while New York Telephone was asked to install something on its own equipment, Apple did not own the iPhone at issue. This cannot be right. Because a system update must be signed by Apple to be accepted by the iPhone, Apple is uniquely positioned to provide the assistance the FBI sought. Moreover, the assistance is necessary because of the design choices that Apple made. That is not necessarily reason enough to compel the assistance, but it does suggest the Supreme Court’s language about parties being far removed was not meant to categorically exclude a company in Apple’s position.

The last two factors from the New York Telephone case are the burden on the company of providing the assistance and the government’s need to obtain it. It is with respect to these factors that Apple’s arguments are most convincing, but in a way that depends on the facts of each individual case. The key burden here is not so much the effort of writing the code, for which Apple would be reimbursed, but the security risk the code would create. Even if Apple tied the code it produced to this particular phone and deployed it only within the company, that internal development and deployment would still increase the risk that other, similar iPhones could be broken into if the knowledge leaked. The risk would be much smaller, though, than if Apple were to produce general-purpose code and deliver it to the FBI.

How much increased security risk is acceptable depends on what the countervailing benefits are. That benefit varies with the nature of the case and the role of the particular device in it, as well as with what alternatives, if any, the government has. Lack of alternatives, standing alone, should not necessarily win the day. In these cases, the government consistently argues that without access to the data, crimes will go unsolved. But that is a trade-off we make throughout the rules that govern criminal procedure: sometimes we let crimes go unsolved to serve other societal goals. Moreover, it may be helpful to ask what the baseline is when we say that crimes are going unsolved. Access to phone data is certainly better for law enforcement than no access. On the other hand, these phones collect and store data that 20 years ago simply would not have existed.

As for how the balance should have been struck in the San Bernardino case, I am actually somewhat ambivalent. Apple could have made a better case for what the security risk would have been, beyond just calling it a “backdoor,” while the government could have made a better case for its need, beyond just calling it a “national security” case. The important point is that when the next fight comes—and it will—neither side should be able to claim an easy win without a careful look at the facts.
