Apple clarified off-the-record that new phones would also be covered by the precedent of the FBI’s request, which explains why they are drawing a line in the sand. Said line, though, is not about encryption. That is important because encryption could be the ultimate victim.
Today’s update is a bit tardy for good reason: I spent quite a bit of time wrapping my head around what is happening with Apple and the FBI (most of today’s content is derived from the iOS Security white paper published by Apple); it’s a more complicated case than it appears at first glance and the implications are obviously significant.
On to the update:
Apple Versus the FBI
A federal judge ordered Apple Inc. to help the U.S. Justice Department unlock an iPhone used by one of the shooters in December’s terrorist attack in San Bernardino, California. Federal investigators haven’t been able to unlock the iPhone used by Syed Rizwan Farook, 28, who carried out a Dec. 2 shooting that killed 14 people at a holiday party, the government said in a filing in federal court in Riverside, California. U.S. Magistrate Judge Sheri Pym on Tuesday ordered Apple to provide “reasonable technical assistance” to the FBI to recover information from the phone…
Farook was using an iPhone 5c owned by the San Bernardino County Department of Public Health with an iOS 9 operating system. Farook and his wife, Tashfeen Malik, 29, were killed in a gun battle with police after the attack on his co-workers. The Justice Department wants Apple to provide customized software that will prevent the data on the phone from being deleted after 10 attempts to input the passcode. The software also must enable agents to send electronic passcodes to the phone, rather than manually typing them in, according to the application. The software will allow agents to automatically enter multiple passcodes to get around the encryption standards.
This is actually a pretty decent overview of the issue at hand (and much better than what I’ve seen at other outlets). Specifically, Apple is not being asked to break the encryption on the iPhone in question (which is impossible — more on this in a moment), but rather to disable the functionality that wipes the memory when multiple wrong passcodes are entered in a row.
Tim Cook responded with an open letter to customers that stated Apple’s unequivocal opposition to the judge’s order:
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand…
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
It’s important to be clear on the definition of backdoor: Cook is taking a broader one which says that any explicitly created means to circumvent security is a backdoor; the reason the distinction matters is that a more narrow definition that says a backdoor is a way to bypass encryption specifically would not apply here. In other words, Apple is not being asked to create a “secret key” for the data (which would be impossible after the fact) but rather to make it easier to brute force the passcode on the device.
Understanding iPhone Encryption
Let me back up a bit and explain what is going on here:
- Starting with iOS 8 all of the data on an iPhone is encrypted on disk with extremely strong encryption. The FBI could extract the data directly from the memory chips but it would take years to brute force the key (i.e. try every possible combination)
- In the case of the terrorist’s phone (an iPhone 5C) the key is generated from a combination of the user-created passcode and a key that is unique to the device (this key is embedded when the phone is manufactured). Thus, while a passcode is massively easier to brute force than the on-disk encryption, said brute forcing can only be done on the device itself
- To make it more difficult to brute force the passcode, the operating system on the 5C gives users the option to wipe the device after 10 failed passcodes (a device is wiped by erasing the key that was created through the combination of passcode and device key; once it’s gone the data on the disk is encrypted forever); the operating system also enforces a 5-second delay between tries
Read this last point carefully: in the case of the iPhone 5C this additional security is provided by the operating system which means the operating system can be amended to remove it; however, only Apple can do so, because an iPhone will only accept an operating system signed using Apple’s master key.
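The key derivation in the second bullet can be sketched in Python. This is an illustration only: on a real iPhone the “tangling” of passcode and device key happens inside the hardware AES engine, and the function name and parameters here are my own stand-ins, with PBKDF2 playing the role of the hardware derivation.

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Illustrative sketch: Apple's real derivation runs inside the
    # hardware AES engine, tangling the passcode with a device-unique
    # key that never leaves the chip. PBKDF2 stands in for it here.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        device_uid,          # the device-unique key acts as the salt
        iterations=100_000,  # work factor slows down each guess
    )

# The same passcode yields a different key on a different device,
# which is why brute forcing must happen on the phone itself.
key_a = derive_key("1234", b"device-A-unique-key")
key_b = derive_key("1234", b"device-B-unique-key")
assert key_a != key_b
```

Because the device-unique ingredient never leaves the chip, extracting the encrypted disk and guessing passcodes on a fast external computer gets an attacker nowhere.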
Therefore, the judge has ordered Apple to build a custom version of the operating system, signed with Apple’s key, that removes the 10-try limitation and the artificial delay between passcode entries (and adds a way to enter guesses via an external device, as opposed to having someone enter passcode guesses manually); this would allow the FBI to brute force the passcode and potentially gain access to the device. I say “potentially” because even with the five-second software limitation removed the 5C’s hardware needs 80 milliseconds to process each request, and we don’t know how long the terrorist’s passcode is: a 4-digit numeric passcode would only take 34 minutes to brute force, while an 8-digit alphanumeric passcode would still take over a million years.
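The arithmetic behind these estimates is simple exponentiation; the exact figures depend on the assumed character set and per-attempt overhead, so treat the numbers below as rough lower bounds at the raw 80-millisecond hardware floor rather than as official estimates:

```python
# Back-of-the-envelope worst-case brute-force times at the 5C's
# ~80ms-per-guess hardware floor, once the OS-level delay and
# wipe-after-10 protections are removed. Published estimates add
# per-attempt overhead, so these are lower bounds.
ATTEMPT_SECONDS = 0.08

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to exhaust every possible passcode of the given shape."""
    return (alphabet_size ** length) * ATTEMPT_SECONDS

print(f"4-digit numeric: {worst_case_seconds(10, 4) / 60:.1f} minutes")
print(f"8-char alphanumeric (62-char set): "
      f"{worst_case_seconds(62, 8) / (3600 * 24 * 365):,.0f} years")
```

The point is the exponential gap: adding length and alphabet size moves the worst case from minutes to geological timescales, which is why passcode strength matters far more than the 80ms constant.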
I’ve been careful here to note that the phone in question is a 5C; the 5S featured the A7 chip which included a “secure enclave.” This is basically a completely independent computer with its own operating system that offers two important security upgrades over the 5C:
- Data on disk was now encrypted with a key that has three ingredients: the device-specific key and passcode plus a unique key generated by the secure enclave that is completely random and unknown to Apple. In other words, simply guessing the passcode doesn’t get you anywhere unless the secure enclave is cooperating.
- To that end, the secure enclave has its own timer that increases how long you have to wait between incorrect guesses: you get the first four for free, then you have to wait a minute, then five, then 15; once you’ve guessed wrong 9 or more times you have to wait an hour every time, which means even a 4-digit passcode would take over a year to brute force.
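Those escalating delays compound quickly. A quick sketch of the schedule just described shows why even the small 4-digit space becomes impractical (the delay table mirrors the one above; the code itself is purely illustrative):

```python
# Cumulative delay imposed by the secure enclave's escalating timer:
# attempts 1-4 free, then 1 minute, 5 minutes, 15 minutes (twice),
# and one hour for every attempt from the ninth onward.
def enclave_delay_seconds(attempt: int) -> int:
    if attempt <= 4:
        return 0
    if attempt == 5:
        return 60
    if attempt == 6:
        return 5 * 60
    if attempt in (7, 8):
        return 15 * 60
    return 3600  # attempt 9 onward: one hour each

total = sum(enclave_delay_seconds(n) for n in range(1, 10_000 + 1))
print(f"All 10,000 four-digit codes: {total / 86_400:.0f} days")
# roughly 416 days of enforced waiting, i.e. over a year
```

Unlike the 5C’s 5-second delay, this timer lives in the enclave’s own firmware, so stripping it out of iOS accomplishes nothing.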
In other words, had the terrorist had an iPhone 5S or later, the judge’s order would be moot: Apple could (correctly) counter that fulfilling the order was impossible because there is no software solution to the secure enclave’s enforcement of an entry delay.
The Risks for Apple and Encryption
There is a point to diving into these details: thanks to the secure enclave, an iPhone 5S or later, running iOS 8 or later, is basically impossible to break into, for Apple or anyone else. The only possible solution from the government’s perspective comes back to the more narrow definition of “backdoor” that I articulated above: a unique key baked into the disk encryption algorithm itself.
This solution is, frankly, unacceptable, and it’s not simply an issue of privacy: it’s one of security. A master key, contrary to conventional wisdom, is not guessable, but it can be stolen; worse, if it is stolen, no one would ever know. It would be a silent failure allowing whoever captured it to break into any device secured by the algorithm in question without those relying on it knowing anything was amiss. I can’t stress enough what a problem this is: World War II, especially in the Pacific, turned on this sort of silent cryptographic failure. And, given the sheer number of law enforcement officials that would want their hands on this key, it landing in the wrong hands would be a matter of when, not if.
This is why I’m just a tiny bit worried about Tim Cook drawing such a stark line in the sand with this case: the PR optics could not possibly be worse for Apple. It’s a case of domestic terrorism with a clear-cut bad guy and a warrant that no one could object to, and Apple is capable of fulfilling the request. Would it perhaps be better to cooperate in this case, secure in the knowledge that the loophole the FBI is exploiting (the software-based security measures) has already been closed, and then save the rhetorical gunpowder for the inevitable request to insert the sort of narrow backdoor into the disk encryption itself that I just described?
Then again, I can see the other side: a backdoor is a backdoor, and it is absolutely the case that the FBI is demanding Apple deliberately weaken security. Perhaps there is a slippery slope argument here, and I can respect the idea that government intrusion on security must be fought at every step. I just hope that this San Bernardino case doesn’t become a rallying cry for (helping to) break into not only an iPhone 5C but, in the long run, all iPhones.
UPDATE: Apple is now telling reporters that the same general technique requested by the FBI would apply to new phones.
By the way according to Apple it is not true that an iOS rewrite of the sort the FBI is asking for here wouldn't work on newer iPhones.
— Farhad Manjoo (@fmanjoo) February 17, 2016
Yes Secure Enclave would make it more difficult and would require a different workaround but Apple could technically get around it.
— Farhad Manjoo (@fmanjoo) February 17, 2016
This suggests that Apple can in fact update the secure enclave without wiping its keys. This does make sense, as it allows Apple to issue updates to the secure enclave with minimal hassle, but it obviously means the secure enclave’s protections are still rooted in software. I wouldn’t be surprised if the delay functionality in particular makes its way into the silicon itself, but that would only apply to future iPhones.
This fact also changes the political calculation: yes, the optics for this particular case are terrible, but if the precedent would be directly applicable then it’s hard to see what else Apple could do.
The Daily Update is intended for a single recipient, but occasional forwarding is totally fine! If you would like to order multiple subscriptions for your team with a group discount (minimum 5), please contact me directly.
Thanks for being a supporter, and have a great day!