APPLE

Why The Apple-FBI Fight Could Lead To A Tech Industry Nightmare

The FBI's demand that Apple circumvent its own security measures could end with the government finally getting the "backdoor" it's wanted for decades

Feb 17, 2016 at 7:55 PM ET

Senator Dianne Feinstein, a California Democrat who’s the vice chair of the Senate Intelligence Committee, took the fight over Apple’s refusal to help the government unlock an iPhone recovered from one of the San Bernardino terrorists to a new level Wednesday. If Apple doesn’t cooperate, she said in an interview with PBS NewsHour, Congress would likely move forward on legislation to create a tech industry nightmare: a “backdoor,” or a security hole purposely introduced to software so that police can get access to data. 

Feinstein’s argument echoes another encryption battle that played out over 20 years ago. In the early 1990s, responding to the growing use of encryption that was unbreakable at the time, the U.S. government developed and tried to promote a cryptographic chipset called the Clipper Chip for use in everyone’s phones. The chip was designed to allow law enforcement to eavesdrop on encrypted phone calls when needed. Activists, however, characterized the feature as a backdoor, and argued that anyone with knowledge of the hole, whether law enforcement or hacker, could exploit it. In response to the backlash, the chip was abandoned after three years.

In this case, the FBI sought the court order to compel Apple to provide technical assistance in accessing data from an iPhone 5c used by one of the shooters in December’s San Bernardino terrorist attack—an order made necessary by recent changes to Apple’s software, which have made iPhone data much harder for law enforcement to access.

The FBI is asking Apple to create new software to enable them to crack the phone’s password, thus allowing them to access data on the phone. But although the FBI maintains that Apple’s circumvention tool would only be used on this one phone, cryptographers have argued in the past that the mere act of creating such software—and undermining the security precautions of the iPhone to do so—could open the door for future abuse.

“In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession,” wrote Apple CEO Tim Cook in a letter to customers posted on the company’s website. Much like the case of the Clipper Chip, Cook described the order as a request for a backdoor.

Cook said Apple plans to oppose the order, characterizing the implications of the government’s demands as “chilling.” The White House reiterated Wednesday afternoon that the Department of Justice is not asking Apple to create a new backdoor, but merely for access to one device.

Since at least 2009, Apple has allegedly complied with requests from law enforcement for help accessing data on its devices, and an Apple spokesperson told The Wall Street Journal in 2012 that, when faced with a court order, the company retrieves the data itself. However, because of both software and hardware updates in recent years, Apple cannot easily access data of interest to police.

Today, personal data on an iPhone is encrypted, and protected by a user’s PIN or passcode. Prior to iOS 8, released in September 2014, data such as text messages, photos, videos, contacts, audio recordings and call history were not encrypted, and could be made available to law enforcement in response to a request. “For all devices running iOS 8 and later versions, Apple will not perform iOS data extractions in response to government search warrants because the files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess,” Apple’s page on government information requests reads.

But if an attacker—or law enforcement agent—can obtain or guess a user’s PIN, they can decrypt the data on the device simply by inputting the passcode. This is where the FBI needs Apple’s help. Since Apple doesn’t have the iPhone’s passcode, the FBI wants Apple to make the passcode easier for police to guess.
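The relationship between the passcode and the encrypted data can be sketched in a few lines of Python. This is an illustrative model only, not Apple's actual scheme: on a real iPhone the passcode is entangled with a device-unique hardware key (the "UID") inside dedicated silicon, and the `DEVICE_UID` value and iteration count below are hypothetical stand-ins. The point it demonstrates is the one in the paragraph above: the decryption key can only be reproduced by someone who knows the passcode, which is why Apple itself cannot extract the data.

```python
import hashlib

# Hypothetical per-device hardware secret; on a real iPhone this lives in
# silicon and cannot be read out by software, including Apple's own.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(passcode: str) -> bytes:
    """Derive a 256-bit data-protection key from the passcode and device UID.

    PBKDF2 stands in here for Apple's real key-entanglement process: the
    same passcode always yields the same key, and nothing short of the
    correct passcode will reproduce it.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# The correct passcode reproduces the key; any wrong guess does not.
assert derive_key("1234") == derive_key("1234")
assert derive_key("1234") != derive_key("0000")
```

Because the derivation is slow by design and bound to the device, an attacker cannot simply copy the encrypted data off the phone and crack it elsewhere; every guess has to go through the phone itself.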

“I think we’re really well into the territory of talking about Apple being forced to hack its own devices. If you read the order, they use a lot of language that comes out of the forensics world. And it’s surprising how detailed that order is,” Jonathan Zdziarski, an iOS forensics and security researcher, told Vocativ. “On a technical level, it’s no secret that this is an order forcing Apple to remove certain security mechanisms, and not only that, but to provide a tool that does this.”

All iPhones include protections to prevent an attacker—in this case, law enforcement—from guessing, or “brute forcing,” PIN combinations in rapid succession. The more incorrect passwords entered, for example, the longer the delay before the device allows the person inputting the code to try again. In some cases, entering incorrect passwords 10 times will trigger the phone to wipe all its data. The FBI wants to disable these features so it can obtain the passcode by guessing at it without being locked out.

In the iPhone 5s and later iPhone models, a chip called the Secure Enclave Processor, or SEP, keeps track of incorrect password attempts. The idea is that, by employing a separate chip that exists outside of the iPhone’s operating system to help handle security, it is less vulnerable to attack. There’s just one problem, from Apple’s perspective: the iPhone 5c doesn’t have an SEP. In other words, it handles password attempts in the iPhone’s operating system, which, according to security research company Trail of Bits, makes the device technically vulnerable to the modifications the FBI wants.

It is unclear, however, whether the SEP in later iPhones can fully prevent the sort of modifications the FBI has requested. John Kelly, a former Senior Embedded Systems Engineer at Apple who worked on the SEP according to his profile on LinkedIn, wrote on Twitter that, if Apple were able to modify iOS for law enforcement in this instance, then it could modify the Secure Enclave’s firmware too.

Kelly had not responded to an email seeking further clarification by the time of publication.

According to Cook and others, the FBI’s request, if upheld, would establish a dangerous precedent—that companies can be compelled to undermine the protective measures built into their products, even when doing so puts the security of users at risk. Such was the nature of the US government’s Clipper Chip proposal, which imagined a standardized form of encryption that law enforcement could still access.

“This is an unprecedented, unwise, and unlawful move by the government,” Alex Abdo, a staff attorney at the American Civil Liberties Union, said in a statement. “The Constitution does not permit the government to force companies to hack into their customers’ devices. Apple is free to offer a phone that stores information securely, and it must remain so if consumers are to retain any control over their private data.”

Abdo further said that the move risked setting a “dangerous precedent” that could be leveraged by repressive regimes.

“I guess the real question is, ‘Can Apple defend itself against its biggest adversary, if their adversary was [itself]?’” Zdziarski said. “This is a crazy, impossible situation they’ve been put into.”