Yesterday, the US Congress heard testimony in an important security case. The FBI wants help breaking into an iPhone, and Apple, backed by other tech companies, is refusing. It sounds like an isolated issue that only iPhone users would care about, but the consequences of this fight will affect all of us.
What the FBI Is Asking For, and Why Apple Won't Do It
At the centre of this debate is a phone owned by Syed Rizwan Farook, the gunman in the San Bernardino mass shooting. The FBI has a warrant to search Farook's work phone (as well as permission from his employer, who owns the device). However, the phone is encrypted and protected with a passcode. Furthermore, Farook enabled an iOS feature that will wipe the phone if someone tries and fails to enter the passcode 10 times.
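To make the wipe feature concrete, here's a minimal sketch of the retry behaviour described above. This is assumed logic for illustration only, not Apple's actual implementation: the `DELAYS` schedule and the `PasscodeGuard` class are hypothetical, but the core idea — count failures, slow down later attempts, and erase after the 10th — is what the article describes.

```python
class PasscodeGuard:
    """Illustrative passcode-retry logic: delays, then a wipe at 10 failures."""

    WIPE_AFTER = 10
    # Assumed escalating delays (in seconds) after repeated failures;
    # real iOS uses its own schedule.
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def delay_before_next_attempt(self) -> int:
        """Seconds the user must wait before the next guess is accepted."""
        return self.DELAYS.get(self.failures, 0)

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data (really, the encryption keys) is gone
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.WIPE_AFTER:
            self.wiped = True  # 10th failure: device erases itself
        return False
```

Two of the FBI's three demands map directly onto this sketch: disabling the wipe means never setting `wiped`, and removing the delays means `delay_before_next_attempt` always returns 0.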
This puts the FBI in a tricky situation. They have the authority to search the phone but -- according to the FBI at least -- they don't have the ability. That's where Apple comes in. The FBI obtained a court order demanding that Apple create a version of iOS that would make it easier for the FBI to break into the phone. To be clear, the FBI is not asking Apple to break into the phone itself. Rather, the court order demands that Apple's update to iOS accomplish three things:
- Disable the feature that would erase the device after 10 incorrect passcodes are entered.
- Allow passcodes to be entered by a computer, rapidly enough to allow brute force attempts (as opposed to having to physically enter them via an on-screen keyboard).
- Remove the time delays between incorrect passcode attempts, so guesses can be entered without waiting.
At this point, it would then be up to the FBI to break into the phone using normal brute force methods. However, in an open letter to its customers, Apple contends that the FBI does not have the authority to ask for this. Furthermore, the company claims that this would set a dangerous precedent. It would effectively compromise the security of every iPhone in the world, and every smartphone by association, since Google and other companies would be subject to the same requirement if the FBI (or any other law enforcement agency) demanded it.
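Some rough arithmetic shows why removing the delays and allowing electronic entry matters so much. The guess rate below is an assumption for the sketch, not a measured iPhone figure, but with delays removed the only limit is how fast the hardware checks each code:

```python
def brute_force_hours(digits: int, guesses_per_second: float) -> float:
    """Worst-case hours to try every numeric passcode of the given length."""
    total_codes = 10 ** digits
    return total_codes / guesses_per_second / 3600

# Assume the hardware can check ~12.5 passcodes per second once the
# delays are removed and guesses are fed in electronically.
four_digit = brute_force_hours(4, 12.5)  # 10,000 codes: well under an hour
six_digit = brute_force_hours(6, 12.5)   # 1,000,000 codes: roughly a day
```

By contrast, if each failed guess triggered even a one-hour lockout, exhausting a four-digit space would take over a year — which is why the FBI wants the delays gone before it attempts "normal brute force methods".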
The FBI, on the other hand, bases its demand on the All Writs Act of 1789. To oversimplify a complex law, the act states that a court can compel a person or company to assist law enforcement in an investigation as long as the following criteria are met:
- The ordered party cannot be too far removed from the case.
- The government cannot impose an undue burden on the ordered party.
- The party's assistance must be necessary, with no other judicial methods available.
- The action must already fall under law enforcement's jurisdiction, and must not incidentally create or expand jurisdiction.
Apple claims that the FBI's request would not just allow it to access Farook's iPhone (which it does have jurisdiction over), but that it would allow access to any iPhone (which the FBI does not have jurisdiction over). They say it's tantamount to building a backdoor, and as long as it exists, there'd be no way to prevent it from being used on every iPhone in the world by anyone who gets their hands on it. In other words, Apple can't unlock just this iPhone the way the FBI is requesting.
While this particular case is gaining a lot of attention, there are several similar cases that the FBI and Apple have disagreed over. In one case that involved the US government compelling Apple to make a "bypass device" for a drug dealer's iPhone, a judge ruled that "the constitutionality of such an interpretation [of the All Writs Act] is so doubtful as to render it impermissible as a matter of statutory construction". In other words, in that judge's view, it's extremely unlikely the FBI has the authority to make the request it's making.
This Case Isn't Just About Encryption, It's About Constitutional Rights
Privacy laws in the United States are generally a mess. Regardless of how controversial or messy the laws may be, tech companies are still obligated to comply with any legal request the government makes. However, Apple's on-by-default encryption means that it can't hand over the data stored on your iPhone even if it wanted to. Currently, there are no laws requiring that Apple be able to access encrypted data in order to turn it over to the government. That's why the FBI can't force Apple to break into Farook's iPhone itself.
That distinction is what makes this case so important. The FBI is not instructing Apple to turn over data it has access to (they can already do that), and it's not ordering Apple to break into the phone itself (which isn't legal). Instead, the FBI is ordering Apple to create entirely new tools so the FBI can bypass the iPhone's security on its own.
Apple's argument is that this violates multiple constitutional rights. For starters, in multiple cases, US federal courts have ruled that code counts as protected speech under the First Amendment. Apple argues that this court order is effectively compelled speech -- meaning the government is forcing Apple to "say" something -- which is not generally legal except under very specific circumstances. Apple further contends that, due to the way an iOS update would have to be signed in order to patch the phone in question, the company would have to sign off on the software as authentic and approved by Apple. Even if the courts can compel Apple to create a tool to bypass its own security measures, they can't compel Apple to state (via software signature) that the deliberately compromised code is up to the company's standards. In Apple's view, this is the government ordering the company to make a false statement.
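The signature point is worth unpacking: an iPhone will only install firmware whose signature verifies against Apple's key, so a weakened iOS build is useless to the FBI unless Apple vouches for it. The sketch below illustrates that gatekeeping idea only. Real iOS uses asymmetric (public-key) signatures checked by the boot chain; a keyed HMAC is used here purely to keep the example dependency-free, and `VENDOR_KEY` is a hypothetical stand-in.

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's signing key. In reality Apple
# signs with a private key and devices verify with the public half.
VENDOR_KEY = b"vendor-signing-key"

def sign_firmware(image: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Produce a signature over a firmware image (HMAC-SHA256 as a stand-in)."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, key: bytes = VENDOR_KEY) -> bool:
    """The device recomputes the signature and installs only on an exact match."""
    return hmac.compare_digest(sign_firmware(image, key), signature)

official = b"iOS update payload"
sig = sign_firmware(official)
assert device_accepts(official, sig)        # vendor-signed build installs
assert not device_accepts(b"tampered", sig) # anything else is rejected
```

This is why the order necessarily compels Apple's participation: no one else holds the key that makes a build installable, and signing the weakened build is the "speech" Apple objects to.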
Furthermore, Apple says that the order also violates its Fifth Amendment rights. Here, the company argues that the FBI is essentially conscripting Apple into becoming an agent for the government by requiring it to create tools it is not otherwise legally obligated to create. Apple claims that the order is an undue burden on the company. According to Apple's motion to vacate, this "violates Apple's substantive due process right to be free from 'arbitrary deprivation of [its] liberty by government'".
If the FBI were allowed to compel Apple to create this modified version of iOS, Apple argues it would set a dangerous precedent. The US government might then be able to compel other tech companies to create additional tools to bypass any form of security. This could essentially lead to tech companies being forced to create forensics labs for the purpose of undermining their own security features.
Even if you set aside the First and Fifth Amendment issues, the immediate consequence of the FBI's court order is that a tool would exist that undermines the security of every iPhone on Earth. This would violate everyone's Fourth Amendment rights that protect against unreasonable search and seizure. While the FBI has a warrant for Farook's iPhone, it doesn't have a warrant for every iPhone it could potentially unlock with the code it's asking Apple to develop. As Apple explains, once a backdoor is in the wild, you can't take it back. While the company wouldn't release said backdoor -- or even publish details on it -- publicly, there's no guarantee the tool, or knowledge of how it was built, wouldn't leak, or that its work couldn't be duplicated. Apple also worries that if the US government can force Apple to hand over a tool like this, foreign governments -- ones with far worse privacy and human rights records -- will do so as well.
The FBI Says Encryption Would Make Smartphones Warrant-Proof
The FBI's counterargument is that if Apple has its way, warrants and court orders to search phones would effectively be meaningless. At a Congressional hearing on the topic yesterday, FBI director James Comey said that the widespread adoption of consumer-level encryption would create "warrant-free spaces" where law enforcement is legally allowed to search, but unable to do so.
The primary retort to this argument is that encrypting a device doesn't make its data impossible to access; it just makes access harder. In fact, the FBI was able to obtain an older iCloud backup of data from Farook's iPhone (and may have been able to access more if it hadn't reset his password). Apple's argument partially hinges on the idea that the FBI doesn't need any outside help.
However, New York County District Attorney Cyrus Vance acknowledged that this precedent isn't just about the FBI. Vance cited over two hundred phones within his office's jurisdiction that may be affected by the decision. While the FBI may have the resources to pour into Farook's iPhone, New York County law enforcement likely can't devote that much time and effort to over two hundred phones. Instead, Vance argued in favour of a framework that would allow a court to authorise access to a device. However, it's unclear exactly how such a plan would be put into place.
The FBI paints a desperate picture for law enforcement. Indeed, even if the FBI could break into Farook's iPhone without Apple's help, there are likely to be plenty of other cases where local law enforcement have the legal right to search a phone, but not the ability. However, the FBI has not addressed the issues around ordering a company to let them into an encrypted device. In fact, Director Comey even admitted during yesterday's hearings that he had not considered that China may similarly order Apple to turn over the same code the FBI is demanding.
Nearly everyone involved is OK with the FBI breaking into Farook's iPhone. However, the bigger implications here will have a huge impact on security in the future. For now, there's not much you can do about this. You can, however, keep an eye on how the US presidential candidates respond to privacy-related issues. This is a big election year in the US, and this dispute could lead to new laws down the road. Depending on who gets elected, their interpretation of privacy law -- and how they view security issues -- will have a big impact on how cases like this play out in the future.