Sorry, FBI: There's no way GovtOS will be contained to just one iPhone


If you think the FBI’s case against Apple is a question of whether or not Apple wants to help prevent terrorism, you may be right, but probably not in the way you think.
In conversations with technology and forensic science experts, including one who may have inadvertently contributed to the FBI's case against Apple, a loss for Apple in the FBI's quest to force the company to build software specifically for hacking the iPhone (software Apple calls "GovtOS") is described as a potential win for bad actors across a wide spectrum.
The FBI, with the Department of Justice's assistance, wants Apple to help it hack an iPhone 5C used by a now-dead terrorist who, along with his wife, killed 14 people last year in San Bernardino, California.
No one disputes that someone (at the county level or even the FBI) reset the phone's iCloud password, foreclosing the possibility of a fresh iCloud backup and leaving the passcode as the only way to unlock and access the phone's data.
The software Apple is being asked to create would circumvent a number of security protections built into the iPhone and iOS, making it possible to submit passcode guesses electronically and in rapid succession, without the escalating delays or auto-erase that normally follow failed attempts and without anyone tapping codes into the iPhone 5C by hand. If it works, the FBI would be able to access whatever missing data Apple did not already willingly hand over. Apple's rejection of this idea does not hinge on a technical limitation. Instead, as outlined in a motion to vacate filed last week, it revolves around what Apple sees as the flaw in the government's reasoning.
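For a rough sense of how fast an unthrottled brute force could run, here is a back-of-the-envelope sketch in Python. The only hard number in it is the roughly 80 milliseconds per guess that Apple's own iOS security documentation cites for the on-device key derivation; the passcode lengths and the arithmetic around them are illustrative assumptions, not figures from the court filings.

```python
# Back-of-the-envelope estimate of a brute-force run once the software
# protections (escalating delays, erase-after-ten-failures) are out of
# the way. Illustrative only: the ~80 ms per guess comes from Apple's
# iOS security documentation; everything else here is an assumption.

PER_GUESS_SECONDS = 0.08  # on-device key derivation, roughly 80 ms per attempt

def worst_case_hours(digits: int) -> float:
    """Hours needed to exhaust every numeric passcode of the given length."""
    combinations = 10 ** digits
    return combinations * PER_GUESS_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: at most {worst_case_hours(digits):.1f} hours")

# A 4-digit passcode falls in roughly 13 minutes of worst-case guessing;
# even a 6-digit one takes about a day. The delays and the ten-try erase
# limit, not the passcode itself, do most of the protective work GovtOS
# would strip away.
```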

Apple’s case

Included in the brief is a lengthy declaration by Erik Neuenschwander, Apple’s Manager of User Privacy. In the filing, Neuenschwander, who has spent the last nine years at Apple and once taught programming at Stanford University, says the government is operating on “a flawed premise.”
The government’s papers suggest that once deployment of GovtOS is completed and the government (presumably) accesses the device, Apple can simply “destroy” GovtOS. The government suggests that this would reduce or eliminate any risk of misuse of the new operating system, including potential use on a device other than the device at issue here. I believe this to be a fundamentally flawed premise.
Put simply, Neuenschwander and Apple believe it is not so simple to destroy a digital creation, especially because, according to the brief, the work would generate a significant amount of testing and documentation along the way.
…quality assurance and security testing requires that the new operating system be tested and validated before being deployed. The quality assurance and security testing process requires that Apple test GovtOS internally on multiple devices with the exact same hardware features and operating system as the device at issue, in order to ensure that GovtOS functions as required by the government’s request.
Depending on how you read the brief, Apple comes across as either overly cautious or alarmist.
Once the process is created, it provides an avenue for criminals and foreign agents to access millions of iPhones. And once developed for our government, it is only a matter of time before foreign governments demand the same tool. Once the floodgates open, they cannot be closed, and the device security that Apple has worked so tirelessly to achieve will be unwound without so much as a congressional vote.

FBI’s limited interest

From the start, the FBI has made it clear that it's interested only in this one iPhone 5C. On Feb. 21, FBI Director James Comey sought to turn down some of the rhetoric:
The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve. We simply want the chance, with a search warrant, to try to guess the terrorist’s passcode without the phone essentially self-destructing and without it taking a decade to guess correctly.
The government’s chief law enforcement agency, however, has sought Apple’s help in accessing 17 other locked phones since October. Experts believe there is virtually no way to contain the government’s latest request.
“This software cannot effectively be made unique,” said Jonathan Zdziarski, who spoke to me Monday morning.
Zdziarski should know what he's talking about. He's something of a reformed hacker who jailbroke the original iPhone and eventually segued into "white hat" work, developing forensic tools for various law-enforcement agencies. He's also written a number of books on the topic, including iPhone Forensics: Recovering Evidence, Personal Data, and Corporate Assets.
Zdziarski told me he read the FBI warrant and Apple’s latest brief. What stood out to him about the FBI request was how familiar parts of it sounded. Many have noted the rather laudable technical detail in the FBI’s original Apple warrant. Zdziarski thinks he knows why: “It’s almost as if they took some of the pages out of my manual and put them in the court order.”
He pretty much dismisses the notion that the FBI can keep GovtOS locked down, suggesting that the idea that a unique ID could prevent the software from running on any other iPhone is fanciful at best. "In my opinion, that protection is among the easiest to break," said Zdziarski. He also expects GovtOS to spread to other law-enforcement and intelligence agencies. "Once it's in the hands of law enforcement, experience tells us its use will broaden exponentially."
Zdziarski paints a kind of nightmare scenario in which everyone from the NSA to Chinese intelligence ends up picking over the code, disassembling it and combining it with other available exploits.

Going forward and going back

If Apple does build GovtOS to spec, it will work on the iPhone 5C but not on later phones (iPhone 5S and above), at least not without alteration.
Starting with the iPhone 5S and its A7 chip, every iPhone includes a secure enclave (not to be confused with the secure element, a separate chip that handles tokenization for Apple Pay). The enclave is a coprocessor that works with the Touch ID fingerprint sensor and uses something called passcode entanglement, which weaves your passcode together with a unique key burned into the device's hardware. That ensures the FBI can't simply copy the iPhone's storage onto a server somewhere and try to crack it there; because the derived encryption key depends on the hardware, every passcode guess has to be run on the phone itself.
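To make the entanglement idea concrete, here is a deliberately simplified sketch. It is not Apple's actual key-derivation scheme, and the names and parameters are purely illustrative; in real hardware the device-unique key sits inside the AES engine and Secure Enclave and is never readable by software.

```python
# Toy illustration of passcode entanglement with a device-unique key.
# NOT Apple's actual scheme: on a real iPhone the per-device UID key is
# fused into the silicon and never exposed to software, which is the
# whole point of the design.

import hashlib
import os

DEVICE_UID = os.urandom(32)  # stand-in for the key burned into this one device

def derive_data_key(passcode: str) -> bytes:
    """Mix the user's passcode with the device-unique key.

    Because DEVICE_UID never leaves the chip, this derivation can only
    run on the device itself; a copied disk image is just ciphertext
    without it.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UID,   # the "entanglement": a hardware-bound salt
        100_000,      # iteration count tuned so every guess costs real time
    )

key_on_device = derive_data_key("1234")

# An attacker who copies the phone's flash storage to a server farm still
# lacks DEVICE_UID, so they cannot reproduce derive_data_key() off-device.
# Every passcode guess has to be funneled through the phone, at whatever
# rate the hardware allows.
```

The sketch is only meant to show why off-device cracking fails; GovtOS would not break this cryptography, it would remove the software checks wrapped around it.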
Apple or anyone else who has access to the GovtOS would have to alter the software in the secure enclave to make it work on newer devices. But the secure enclave is designed to be tamper-proof (it's right there in the name), so this is a seemingly impossible task, right?
“I wouldn’t say it’s a minor change,” Zdziarski cautioned, but, “I would say it’s a feasible change that Apple could make.”
Zdziarski also believes Apple could alter the secure enclave via a firmware update in such a way that it would be virtually impossible to use GovtOS on future phones. But even setting the newer devices aside, could GovtOS ever really be limited to just one iPhone 5C?
Not exactly.

As Tim Cook noted last week, this code could make "hundreds of millions of customers vulnerable around the world." He may have been referencing a more concrete number he revealed in January, when he announced that Apple had one billion active devices in use. It's probably safe to assume that a large percentage of those are older devices, like the iPhone 5C, iPhone 5, iPhone 4S and iPhone 4, whose chips don't include the secure enclave. All of those phones would be vulnerable to Apple's hack software if it got into the wild.
It’s a possibility Apple worries about openly in the brief:
Given the millions of iPhones in use and the value of the data on them, criminals, terrorists, and hackers will no doubt view the code as a major prize and can be expected to go to considerable lengths to steal it, risking the security, safety, and privacy of customers whose lives are chronicled on their phones.
For Zdziarski, it's not a question of if, but when. He told me that other agencies will see the GovtOS tool being used and validated in law enforcement.
“Eventually it will be ingested and used for any purpose law enforcement wants to use [it] for.”

Stop the terrorist

It’s that possibility, and the fact that it could be used on so many existing phones (with a minimal, but not inconsequential, change to the ID), that convinces Zdziarski that Apple is right, even in the face of terrorism concerns.
“If [people] are afraid of terrorism, then the last thing they want is for a half a billion phones to be compromised across the world,” he said, and without irony pointed to the recent iCloud hacking scandal, in which dozens of celebrity accounts were breached and their personal photos and videos posted on Reddit. There was, he told me, also geotagging information in that data. With that information, a cyberstalker could easily become a real-life stalker, or worse.
“If you care about terrorism and preventing it, it makes sense that you would want data to be secure on these devices,” added Zdziarski, who told me he doesn’t entirely understand the focus on this one phone at the expense of all others. “One out of a half a billion might have terrorist info on it,” he said, adding that the rest appear to be owned by law-abiding citizens. Doing all this for that one phone “would actually make us a lot less safe.”
“I’m not a philosopher, but this is basic math,” he said.




