Nov 24, 2016

Absolutely, and the commitment to security and privacy keeps me coming back to iOS. The iOS security white paper[1] is an excellent read for anyone interested in security. Unlike Android devices, iOS security is coupled to a hardware security module that Apple refers to as the Secure Enclave. It's used not just to encrypt and decrypt, but to verify the entire boot process and all binaries used by the cellular chips.

Now that Google is manufacturing devices, I expect them to begin following Apple's example: avoid garbage like TrustZone and do it right.

1. https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Nov 05, 2016

>We all want to prevent people hanging from cranes, but crypto alone does not equal privacy, does not keep you safe, especially not in those countries, where rubber hose cryptanalysis (https://xkcd.com/538/) is much more common. In those dangerous situations/countries, good operational security practices are better to avoid detection/suspicion than using any 'magic' crypto messaging app.

Then just put a TL;DR at the end of your article and say 'use iMessage/FaceTime, it's probably good enough for operating in an unstable region.' If it worked for Erdogan, it will work for you!

EDIT: (While I intend to come off as cocky, I'm serious:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf page 41

http://www.independent.co.uk/news/world/europe/turkey-coup-e...

)

Nov 03, 2016

It is true that iCloud has a large number of issues.

> Let me repeat that: you need to ERASE YOUR PHONE TO CHECK __IF__ A BACKUP EXISTS.

Maybe I don't understand what you're getting at here, but you can see if there is a backup of your device. Settings > iCloud > Storage > Manage Storage – Lists all the devices w/a backup, the last time they were backed up, the size of the backup, an estimate of the size of the next backup, etc.

> My Passbook/Wallet needs to be recreated every time I set up a device?

See "How Apple Pay uses the Secure Element" - https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Sep 20, 2016

I made the title match the URL by adding "5C" specifically. Ars sometimes changes titles around after publishing and in this case I think having "5C" specified is important as that appears to be the only model that was actually tested. In the arxiv paper, Section VI. "Future Work", the author writes:

>The iPhone 5c device being analyzed in this research project was far from the latest Apple phones. Since then several new models were introduced such as iPhone 5s, iPhone 6 and 6s, iPhone SE and iPhone 7. However, iPhone 5s and 6 use the same type of NAND Flash memory devices. It would be logical to test them against mirroring.

Which seems to me to show that only the 5C has been tried, and the 5C lacks the Secure Enclave. Somewhat to my surprise the Secure Enclave isn't mentioned in the paper at all, so I'm not sure this is actually applicable to later-model iPhones, as the author asserts based purely on NAND type. The replay counter is stored within the Secure Enclave itself [1], so mirroring the Flash should be useless in terms of gaining additional manual input attempts, and thus of rapidly diminishing importance as older iPhones cycle out of working use.

If this applies only to older devices it's still worth a bit of notice, though, as it contradicts what the FBI said earlier in the year, and as there are plenty of older devices still around. The iPhone 5C itself was only completely discontinued worldwide this past February (in India; discontinued elsewhere in Sept 2015), and iPads tend to be held onto longer than iPhones. Anyone using those devices in a situation where they may face a significant threat of physical attack should keep in mind that they should use a full alphanumeric passcode, even though, since the 5C lacks Touch ID, it's less convenient and can't cover the same gamut of threat profiles.

1: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
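
To see why the anti-replay counter matters here, a minimal sketch may help. The HKDF construction below is an assumption made purely for illustration; the guide only says data the Secure Enclave saves to the file system is encrypted with a key entangled with the UID and an anti-replay counter.

    # Toy model of why an anti-replay counter defeats NAND mirroring on
    # devices with a Secure Enclave.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    UID = os.urandom(32)       # fused into the silicon, never leaves the enclave
    replay_counter = 7         # monotonic counter held inside the enclave

    def storage_key(counter: int) -> bytes:
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=counter.to_bytes(8, "big")).derive(UID)

    nonce = os.urandom(12)
    blob = AESGCM(storage_key(replay_counter)).encrypt(nonce, b"failed attempts: 9", None)

    replay_counter += 1        # enclave state advances (e.g. after a wipe)
    # Writing the old `blob` back to flash no longer helps: decrypting it with
    # the key for the new counter value raises InvalidTag.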

Sep 15, 2016

They are wrong. The A7 added a hardware passcode attempt counter that would defeat their method.[1]

And that's not all. With the introduction of Touch ID, Apple has shifted to 6 digit passcodes as the standard. The authors note that their method would not work so well, even if they had infinite time:

"Given six attempts per each rewrite this method would require at most 1667 rewrites to find a 4-digit passcode. For a 6-digit passcode it would require over 160 thousand rewrites and will very likely damage the Flash memory storage."

If you want to decrypt a modern iPhone you'd probably have to try to poke inside the Secure Enclave with an SEM or something. And then you're getting into the realm of hardware defenses against this kind of intrusion, like physical self-destruction when probed... not sure if Apple's doing anything there, but it's crazy stuff.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 12)

Sep 15, 2016

The iOS security guide would appear to contradict this:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

> On devices with an A7 or later A-series processor, the delays are enforced by the Secure Enclave. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period.

This is not super specific, but would imply that the secure enclave has its own storage.

Sep 15, 2016

From what Apple's released on how iPhone security works [1], it sounds like such keys are still written to external flash, just in a much more low-level way. So there may be a theoretical way to do this attack on a more recent iPhone, but you'd have to do a lot more reverse engineering to figure out a few layers of undocumented proprietary protocols.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Aug 09, 2016

They do care, even if they don't fully grasp what details they should care about.

Since we're often that "tech friend", we should know what we're talking about:

iOS Security white paper: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Apple's privacy pages: http://www.apple.com/privacy/

How iOS Security Really Works: video of a session from WWDC 2016 — https://developer.apple.com/videos/play/wwdc2016/705/

Jun 15, 2016

Alternatively you can also RTFM before complaining about secrecy... https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Of course that will never be as audit-friendly as open-source code. But don't call it a secret when you actually just didn't search for the information...

Jun 02, 2016

Apple says which third-party providers it uses for cloud storage in the iCloud section of its security guide (valid as of May 2016):

> The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as Amazon S3 and Windows Azure.

[0]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

May 30, 2016

Repeated from above: The iPhone 5C predated the introduction of the secure enclave. The 5C contains an A6 chip; according to page 7 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf, the secure enclave is only available with A7-based devices or newer.

May 30, 2016

The iPhone 5C predated the introduction of the secure enclave. The 5C contains an A6 chip; according to page 7 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf, the secure enclave is only available with A7-based devices or newer.

May 02, 2016

Everything you ever wanted to know about iOS security (and a few things you didn't know you wanted to know): https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Apr 14, 2016

It's not really about hacking Touch ID. Apple has published a really thorough whitepaper[0] on the security of Touch ID and the secure enclave. I don't really think that hacking the Secure Enclave to extract fingerprints is even possible.

The problem I see is using fingerprints as passwords: they are unique to your person, unchangeable, and left behind everywhere in a very liberal fashion.

Imagine for a second that the San Bernardino iPhone had used Touch ID, don't you find it highly plausible that the US government would be able to find a good fingerprint that could be used to unlock the phone? I guess they even had his body at hand so it would have been dead simple.

0: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Apr 06, 2016

Refer to the iOS Security Guide [1], starting on page 10.

1. https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 30, 2016

Good (excellent) article on iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

See Page 12 on advantage of A7 or higher processor, but, in particular:

"On devices with an A7 or later A-series processor, the delays are enforced by the Secure Enclave. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period. "

The question still outstanding is whether the Secure Enclave can be modified, and also whether it can be modified without a passcode.

Mar 30, 2016

The Secure Enclave, which among other things prevents tampering with the microkernel and related modules of iOS, was introduced with the A7 SoC.

So basically only the iPhone 5C and older models are easy to compromise; with the iPhone 5S and newer it gets a _lot_ harder.

Details here, from the mothership itself: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 29, 2016

Interesting. What's your source? Apple's whitepaper suggests otherwise, to my reading:

"The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing."[1]

What this says to me is that while rewritable data storage is indeed kept in regular commodity flash memory chips, it's all encrypted by a unique device-specific key that is somehow burned into the secure enclave. So that one little secret kept inside the enclave would allow it to store everything else off-chip.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 28, 2016

I'm just using the term casually. Technically I think it's a hierarchy of keys wrapping keys. It's all explained here: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 28, 2016

Most likely it's already been fixed.

The iPhone 5C in question uses an A6 processor. It encrypts data by commingling the passcode with the unique device ID to create a strong 256-bit key, so you can't just pull and brute force the flash memory chip. Meanwhile the OS will wipe the key if you guess a wrong passcode too many times, making the data forever inaccessible.

However, there is one vulnerability with the A6 that some have theorized about.[1] If you could somehow get around the wiping part, you could keep guessing passcodes; a typical 4- or 6-digit passcode could be guessed in under a day. So it may be possible to copy the flash memory into a soldered-in test rig that is effectively wipe-proof, restoring the contents every time it's wiped. That's the best guess I know of for what happened here.

But starting with the A7 Apple added the secure enclave. This now enforces at the hardware level an escalating time delay with each wrong passcode guess. It goes all the way up to a one hour delay.[2] That's also where the (unreadable) unique device ID resides, so there's no swapping out the processor with a rig. The key is forever wedded to this protection against brute forcing.

That is pretty darn spiffy from a security standpoint. If it works as designed, about the only hope anyone has of getting at data from an A7 or later device is through iCloud backups.[3]

[1] https://www.aclu.org/blog/free-future/one-fbis-major-claims-... see also http://blog.cryptographyengineering.com/2014/10/why-cant-app...

[2] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

[3] Either that or some kind of peek into the secure enclave. It's specifically designed to inhibit this at a hardware level, but perhaps a nation-state could figure it out (e.g. verrrrry carefully grinding it down without destroying it and looking at state with an electron microscope).

Mar 22, 2016

This is plainly wrong; the device is fully encrypted. RTFM: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

PS: But yeah the optional iCloud backup is currently the weak spot.

Mar 21, 2016

While the legal maneuvering is interesting, I'd like to talk more about the technical mechanisms. Is it actually possible?

Snowden said the FBI is full of shit[1] and of course the phone is hackable, citing an ACLU report.[2] This report states that one could "easily" bypass the auto-erase-after-10-attempts function by popping out the Flash memory chip, copying its contents into some sort of test rig wired in its place, and then restoring it whenever it gets erased.

This is an interesting modification of an attack scenario laid out in an excellent review of iPhone/iOS8 security by Matthew Green:

"Since only the device itself knows UID -- and the UID can't be removed from the Secure Enclave -- this means all password cracking attempts have to run on the device itself. That rules out the use of FPGA or ASICs to crack passwords. Of course Apple could write a custom firmware that attempts to crack the keys on the device but even in the best case such cracking could be pretty time consuming, thanks to the 80ms PBKDF2 timing."[3]

What this theoretical rig changes is that it essentially allows a custom chip to run on the device (namely a wipe-proof Flash chip), bypassing the need for Apple to write custom firmware. So a typical 6-digit passcode would take under a day to crack, based on the 80ms cost per attempt.

So, it does seem possible to crack the pre-A7 phone in question with this rig.
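
As a thought experiment, the rig the ACLU describes amounts to something like the loop below. The `nand` and `try_passcode` interfaces are hypothetical stand-ins; real mirroring would also have to cope with wear-leveling, ECC, and the proprietary NAND bus.

    # Hypothetical brute-force-with-restore loop for a pre-A7 device.
    from itertools import product

    def brute_force_with_restore(nand, try_passcode, attempts_per_wipe=10):
        golden_image = nand.dump()                  # image the flash once
        attempts = 0
        for digits in product("0123456789", repeat=6):
            guess = "".join(digits)
            if try_passcode(guess):
                return guess
            attempts += 1
            if attempts % attempts_per_wipe == 0:
                nand.restore(golden_image)          # undo the auto-erase
        return None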

However, and here is where it gets interesting, Apple has said conflicting things about current phones. On the one hand, ever since the A7 they've added a hardware-level escalating time delay between failed passcode attempts:

"On devices with an A7 or later A-series processor, the delays are enforced by the Secure Enclave. If the device is restarted during a timed delay, the delay is still enforced, with the timer starting over for the current period."[4]

This would in theory make it infeasible to attempt this kind of rig on a current iPhone. Even a typical weak passcode would encounter an hour-long delay at least once every 10 attempts. It could take years to bruteforce all but the most predictable passcodes.

However, Apple has also said that "Yes, it is certainly possible to create an entirely new operating system to undermine our security features as the government wants."[5] This would seem to suggest that software alone could enable bruteforcing, and this implication is in stark contrast to the statement on hardware defenses within the secure enclave. (Did they mean possible only on pre-A7 phones? It sure feels like they feel there's more at stake than that.)

So I don't know what to believe at this point. The ACLU seems wrong in suggesting that this particular rig would work on anything but old pre-A7 iPhones, based on the current secure enclave's time delay. But Apple has outright stated that GovtOS could enable the cracking of iPhones. So... how?

[1] https://twitter.com/Snowden/status/707299113449230336

[2] https://www.aclu.org/blog/free-future/one-fbis-major-claims-...

[3] http://blog.cryptographyengineering.com/2014/10/why-cant-app...

[4] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

[5] http://www.apple.com/customer-letter/answers/

Mar 21, 2016

Your failure to research the facts behind this statement is obvious. Fingerprints on an iOS device are stored in the Secure Enclave, a hardware module on the device itself, used for checking fingerprints and answering yes or no to the question "is this the user's fingerprint?"

As for Apple Pay, your card details are used once -- exactly once -- to generate an obfuscated alias card identifier, again, only useable by the device.

There is no database. Fingerprint data doesn't leave the phone. No one stores your credit card details.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Research before writing.

Mar 16, 2016

"Each file is broken into chunks and encrypted by iCloud using AES-128 and a key derived from each chunk’s contents that utilizes SHA-256. The keys, and the file’s metadata, are stored by Apple in the user’s iCloud account. The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as Amazon S3 and Windows Azure." (https://www.apple.com/business/docs/iOS_Security_Guide.pdf)

Although your IP address and some other connection metadata will be known to Google.
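
A minimal sketch of that per-chunk scheme follows. The guide doesn't name the cipher mode or say how the SHA-256 digest becomes a 128-bit key, so GCM and the truncation step here are assumptions for illustration.

    import hashlib, os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_chunk(chunk: bytes):
        key = hashlib.sha256(chunk).digest()[:16]   # AES-128 key derived from the chunk's contents
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, chunk, None)
        # The key and metadata stay with the iCloud account; only the
        # ciphertext goes to the third-party storage provider.
        return key, nonce, ciphertext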

Mar 12, 2016

Lots of info here:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The short version is that your passcode (whether four digits, six digits, or a full password) is combined with an encryption key embedded in the device in a way that's supposed to be impossible to extract, and used to derive the encryption key used to protect your data.

The device you linked to relies on a vulnerability in iOS, where it would report that a passcode entry failed before recording that failure to nonvolatile storage. Normally, the device starts adding more and more delays to passcode entry after a few failures. By cutting power to the device immediately after it reported failure, it bypasses those escalating delays. As your link mentions, Apple fixed this vulnerability in a subsequent OS update, so that hardware only works on older OSes. This phone's OS is too new.

Mar 12, 2016

Apple's description is here https://www.apple.com/business/docs/iOS_Security_Guide.pdf, and you can find varying quality of discussion in the media by searching for Secure Enclave.

Basically, the Secure Enclave contains a 256-bit AES key physically fused into the silicon during the chip fabrication process. Apple don't know this key, and neither do the manufacturers. It's different on every iPhone. The key cannot be read by any software, or the OS, or even firmware. All that can be seen is the result of using it in a crypto operation.

The key used for actual encryption on iOS is derived by taking an intermediate key derived from the PIN, and then entangling it with the Secure Enclave key (and, I believe, the CPU's key, which is also unique and fused into the hardware, but not quite so secretive). This effectively ties the crypto process to the phone - if you take a data dump of storage and try to brute force it on some more powerful kit, cracking the PIN isn't enough. You'll also have to crack both the AES keys.

This isn't universal across all iPhones - I think the 5S onwards have it.

Mar 08, 2016

For more on this see "File Data Protection" and "Data Protection classes" in https://www.apple.com/business/docs/iOS_Security_Guide.pdf .

The default protection class is "Protected Until First User Authentication". This means that unless an app says something more specific, the key required to read a file is not available between reboot and the first time a phone is unlocked.

Mar 08, 2016

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

> Every iOS device has a dedicated AES 256 crypto engine built into the DMA path between the flash storage and main system memory, making file encryption highly efficient.

> The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed by dedicated AES engines implemented in silicon using the UID or GID as a key.

> Additionally, the Secure Enclave’s UID and GID can only be used by the AES engine dedicated to the Secure Enclave. The UIDs are unique to each device and are not recorded by Apple or any of its suppliers.
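
In software terms the fused-key arrangement looks roughly like the caricature below: callers can ask for encrypt/decrypt but can never read the key. This is illustration only; the real boundary is enforced in silicon, and GCM here is just a stand-in for the dedicated AES engine.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    class DedicatedAESEngine:
        """Exposes crypto operations, never the key itself."""
        def __init__(self):
            self.__uid = os.urandom(32)      # "fused" at construction; no getter exists

        def encrypt(self, nonce: bytes, data: bytes) -> bytes:
            return AESGCM(self.__uid).encrypt(nonce, data, None)

        def decrypt(self, nonce: bytes, data: bytes) -> bytes:
            return AESGCM(self.__uid).decrypt(nonce, data, None)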

Mar 04, 2016

There are more details on the HSM in the iOS security guide here. https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 03, 2016

> It's also what they've been saying about iPhones for the last few years, even though it's now been shown to be untrue.

Well, the model of phone in that case (5C) is 2.5 years old, using a 3.5-year-old version of their CPUs that lacks some of the modern hardware necessary (the Secure Enclave).

High-level details here: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 29, 2016

This assumption is not correct. The (4-digit) passcode is entangled with the device's unique ID, which is burned into the A6 application processor (on the other side of the logic board). It's theoretically possible to extract the UID from the A6 at the silicon level, but not realistically possible.

I suggest you read the iOS Security Guide: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 29, 2016

The author is absolutely, totally wrong.

The PIN number is entangled with the device's 256-bit "UID", which itself is on-die in the SoC/CPU and NOT extractable without either being able to run code on the CPU, or decapping the SoC, reverse engineering its implementation, and extracting the UID, all from the SEM imagery.

The PIN number and the UID are fed to key derivation code for strengthening; the result of that process is used to actually perform encryption of the data on the NAND.

The weak point here is the PIN number; the FBI would be extremely hard-pressed to brute force the derived AES keys.

This is described in the "Tangling" definition on page 59 of the iOS Security Guide: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

I flagged the article, as the entire argument is predicated on a factually false premise.

Feb 25, 2016

> Apple is required to have backdoors, at least on iPhones sold in foreign countries, isn't it?

I don't believe this is the case.

> Even if the SE were completely secure, a rogue update of iOS could intercept the fingerprint or passcode whenever it is typed, and replay it to unlock the SE when spies ask for it. As far as I know, the on-screen keyboard is controlled by software which isn't in the SE.

What you say about an on-screen passcode is likely true, but the architecture of the Secure Enclave is such that the Touch ID sensor communicates over an encrypted serial bus directly with the SE and not with iOS itself. It assumes that the iOS image is not trustworthy.

From the white paper [1]:

It provides all cryptographic operations for Data Protection key management and maintains the integrity of Data Protection even if the kernel has been compromised.

...

The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but cannot read it. It’s encrypted and authenticated with a session key that is negotiated using the device’s shared key that is provisioned for the Touch ID sensor and the Secure Enclave. The session key exchange uses AES key wrapping with both sides providing a random key that establishes the session key and uses AES-CCM transport encryption.

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
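
Sketched out, that sensor-to-enclave transport looks something like the following. How the two random contributions combine into the session key isn't specified in the guide, so the XOR step is purely an assumption; the wrap-then-CCM structure follows the quoted description.

    import os
    from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap
    from cryptography.hazmat.primitives.ciphers.aead import AESCCM

    shared_key = os.urandom(32)      # provisioned for this sensor/enclave pair at the factory

    # Both sides contribute randomness; assume the session key is the XOR.
    sensor_rand, enclave_rand = os.urandom(16), os.urandom(16)
    session_key = bytes(a ^ b for a, b in zip(sensor_rand, enclave_rand))

    # The session key travels wrapped under the provisioned shared key.
    wrapped = aes_key_wrap(shared_key, session_key)
    assert aes_key_unwrap(shared_key, wrapped) == session_key

    # Fingerprint data then crosses the serial bus under AES-CCM.
    nonce = os.urandom(13)
    ciphertext = AESCCM(session_key).encrypt(nonce, b"raster scan data", None)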

Feb 25, 2016

>its RAM is on chip (not shared with the main CPU, and probably has ECC)

Apple's security guide would indicate otherwise; look on page 7. The Secure Enclave encrypts its portion of memory, but that memory isn't built into the Secure Enclave itself.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 24, 2016

Short answer: yes.

Longer answer: There's a key that encrypts the actual data, and that key is stored on disk, but encrypted with your passcode along with a hardware key. The hardware key cannot be read, only used to decrypt. Changing your passcode just re-wraps the key stored on disk; the underlying encryption key doesn't change, so the operation is quick but security is preserved.

Longest and most accurate answer: https://www.apple.com/business/docs/iOS_Security_Guide.pdf from page 10.

Feb 22, 2016

I think the confusion stems from the iOS security guide that Apple published. Page 7 of the guide states that "The Secure Enclave is a coprocessor fabricated in the Apple A7 or later A-series processor. It utilizes its own secure boot and personalized software update separate from the application processor," which implies that updating it is somehow more secure, without saying exactly how much control Apple has over updating it, or whether the phone needs to be unlocked before it accepts new firmware. Given that they haven't come out and said that they can't override the firmware for locked phones, I'd say they can. Although, before Apple's recent statements I would have assumed that they couldn't, so the confusion is understandable. The guide is at https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 22, 2016

I am not a security expert, but my understanding is that the main problems for the FBI are: a) There is a timed delay for trying new passwords after enough unsuccessful attempts and b) The phone will wipe itself after 10 unsuccessful attempts.

It's pretty much sufficient to have only (a), since the delays alone would make it take years to guess the password by brute force.

I just ctrl-f'd for "delay" in the security guidelines[0], which claim that the Secure Enclave is what enforces the timed delay[1], so I guess the only attack vector would be if you could somehow backdoor code onto the chip. I can't find anything in the guide from a quick skim, but I'd suspect the code is in ROM or is somehow prevented from being upgraded without an unlock.

[0] https://www.apple.com/business/docs/iOS_Security_Guide.pdf [1] Page 12

Feb 18, 2016

This appears to be true. If you look at Page 5 of [1], there's a side note about DFU mode. There's no indication that updating the firmware wipes the secure contents of the device, which it would need to in order to be secure:

""" Entering Device Firmware Upgrade (DFU) mode

Restoring a device after it enters DFU mode returns it to known good state with the certainty that only unmodified Apple-signed code is present. DFU mode can be entered manually: First connect the device to a computer using a USB cable, then hold down both the Home and Sleep/Wake buttons. After 8 seconds, release the Sleep/Wake button while continuing to hold down the Home button. Note: Nothing will be displayed on the screen when the device is in DFU mode. If the Apple logo appears, the Sleep/Wake button was held down too long. """

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 18, 2016

Assuming you are referring to the San Bernardino case, that is an iPhone 5c which features an A6 series CPU. The Secure Enclave feature was introduced with the A7 [1].

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 5)

Feb 18, 2016

Unfortunately this writer does not understand the boot loader security on iOS. Since the public half of Apple's signing key is burned into the device, only firmware signed by Apple can be loaded, so Apple is the only one who can modify its functionality. Even then, the only firmware that can be modified without unlocking the device is the lowest-level boot loader, which likely has numerous size and functionality limitations, since not all device features are enabled at that stage. I would not deem this a flaw in the device design. If Apple creates a signed piece of software that allows for a brute-force exploit, it is creating a backdoor for the FBI.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

> That is completely not true. There is no way to make such a thing that can only work on one particular phone

The technique that makes this possible is described in Apple's iOS Security White paper, page 6 ("System Software Authorization"): https://www.apple.com/business/docs/iOS_Security_Guide.pdf

This mechanism explains why you can't take an old release of iOS off a different phone and copy it to yours.
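
Very roughly, the personalization step binds the server's signature to the firmware, the device's unique ECID, and a fresh nonce, so the resulting ticket can't be replayed on another device or reused later. The key type and field layout below are illustrative, not Apple's actual format:

    import os, hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    server_key = Ed25519PrivateKey.generate()        # stands in for Apple's signing server

    def personalize(firmware: bytes, ecid: int):
        nonce = os.urandom(20)                       # generated by the device per request
        message = hashlib.sha256(firmware).digest() + ecid.to_bytes(8, "big") + nonce
        return nonce, server_key.sign(message)       # the ticket the device verifies at boot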

Feb 17, 2016

It doesn't quite work that way. The files on the device are ultimately encrypted with a dependency on both the hardware key in the Secure Enclave and the user's passcode. It's impossible to update the firmware without both pieces of information or erasing the device, and on A7 or later processors the unlock attempt delay is a direct result of the method of encryption used and tied to the hardware so it must be performed on the device itself.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

In reading the iOS security guide, it's not clear to me that the device GID is actually left unrecorded. See here:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

> The UIDs are unique to each device and are not recorded by Apple or any of its suppliers. The GIDs are common to all processors in a class of devices (for example, all devices using the Apple A8 processor), and are used for non security-critical tasks such as when delivering system software during installation and restore.

The 'not recorded' explicitly refers only to UID, not GID. This means that in theory the GID is accessible and knowable to/by Apple. With this information, it should be possible to use a different processor in conjunction with the secure enclave that spoofs the correct GID.

Correct me if I'm wrong, but isn't this sufficient to bypass the time-delay and thereby unlock the phone?

Feb 17, 2016

And yet it would be less interesting to consider if the password were a "six-character alphanumeric passcode with lowercase letters and numbers," because even if the software rate-limiting were disabled with a rogue firmware update, the PBKDF2 (or similar) iteration count makes brute-forcing impractical.

> A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers

(Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
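
The 80 ms figure is easy to sanity-check; the numbers below ignore the escalating delays that kick in on top of it:

    SECONDS_PER_ATTEMPT = 0.08    # calibrated PBKDF2 cost per on-device guess

    for label, space in [("4-digit PIN", 10 ** 4),
                         ("6-digit PIN", 10 ** 6),
                         ("6-char lowercase alphanumeric", 36 ** 6)]:
        seconds = space * SECONDS_PER_ATTEMPT
        print(f"{label}: {seconds / 3600:.1f} hours ({seconds / 3.15e7:.2f} years)")
    # 4-digit: ~0.2 h; 6-digit: ~22 h; 6-char alphanumeric: ~5.5 years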

Feb 17, 2016

> The best Apple could do is sign a malicious update to the Secure Enclave firmware that either removes the time delays or dumps the keys.

Dumping the Secure Enclave would not result in the keys necessary to read the files on the filesystem. Each file has a unique key, which is wrapped by a class key, and for some classes, the class key is wrapped by a key derived from the passcode. If you don't have the passcode, you can't unwrap any of the keys (Page 12 of https://www.apple.com/business/docs/iOS_Security_Guide.pdf).
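
A sketch of that hierarchy follows. The guide describes RFC 3394-style AES key wrapping for the per-file keys; everything else here, including the key sizes and where the keys come from, is illustrative.

    import os
    from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

    passcode_derived_key = os.urandom(32)   # tangled passcode + UID (see other comments)
    class_key = os.urandom(32)
    file_key = os.urandom(32)

    wrapped_class_key = aes_key_wrap(passcode_derived_key, class_key)
    wrapped_file_key = aes_key_wrap(class_key, file_key)    # kept in the file's metadata

    # Without the passcode-derived key you can't unwrap the class key, and so
    # every per-file key beneath it stays opaque.
    assert aes_key_unwrap(passcode_derived_key, wrapped_class_key) == class_key
    assert aes_key_unwrap(class_key, wrapped_file_key) == file_key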

Feb 17, 2016

Sure thing.

Starting with the A7 CPUs, the iPhone CPU has a "secure enclave" which is basically a miniature SoC within the SoC. The secure enclave has its own CPU with its own secure boot chain and runs independently of the rest of the system. It runs a modified L4 microkernel and it does all of low-level key management.

The secure enclave contains a unique ID burned into the hardware. This ID can be loaded as a key into the hardware AES engine, but is otherwise designed to be completely inaccessible. Assuming AES is secure, that means the key can be used to encrypt data but can't be extracted, not even by the supposedly secure software running in the secure enclave. This key is then used to generate other keys, like the ones used to encrypt files. That means you can't extract the flash memory, connect it to a computer, and then try to brute force it from there. Or rather you can, but you'll be brute forcing a 256-bit AES key, not a 4-digit PIN, making it effectively impossible.

One of the secure enclave's tasks is taking a PIN (or fingerprint) and turning it into the encryption key needed to read the user's files. The main system just hands off the user's code to the secure enclave, and gets back either a key or a failure. The escalating delays with successive failures and wipe after too many failures are both done in the secure enclave. That means that updating the device's main OS won't affect it.

All of this is discussed in Apple's security guide here:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The one open question is software updates for the secure enclave. According to that guide, its software can be updated. Does that mean it can be updated with code that removes the restrictions and allows brute-forcing passcodes? The guide doesn't address how the updates work.

My guess, based on how meticulous Apple is about everything else, is that updates are designed to make this scenario impossible. The secure enclave must be unlocked to apply an update, or if updated without unlocking it wipes the master keys. This would be pretty simple to do, and it would fit in with the rest of their approach, so I think it's likely that this is how it works, or something with the same effect.

Feb 17, 2016

Please see this link[1]; Apple explains exactly how keys are stored in their data centers (hint: it is not in clear text). They use HSMs, which destroy the user's key after 10 failed attempts.

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

That's what the A7 (iPhone 5S and later) design does:

“Each Secure Enclave is provisioned during fabrication with its own UID (Unique ID) that is not accessible to other parts of the system and is not known to Apple. When the device starts up, an ephemeral key is created, entangled with its UID, and used to encrypt the Secure Enclave’s portion of the device’s memory space. Additionally, data that is saved to the file system by the Secure Enclave is encrypted with a key entangled with the UID and an anti-replay counter.”

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The device in question is an iPhone 5C, which uses the older A6 design.

Feb 17, 2016

Yes, see page 12 here https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

This is a good overview https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

What it means is that the best the FBI can come up with is "Make a way for us to brute-force the passphrase." And a brute-force attack is worthless against a strong enough passphrase. That's what is reassuring.

Not to mention that this is for the iPhone 5c. As other comments have mentioned, newer iPhones have the hardware-based Secure Enclave, which adds to the difficulty of breaking into the phone. https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

There are escalating time limits on incorrect PIN attempts, which are also enforced in hardware by the Secure Enclave in A7 chips and above. This means that even breaking a 4-digit PIN without that delay being removed would take a long time. Additionally, the device may be set to wipe after 10 incorrect attempts.

Attempts | Delay
1-4      | none
5        | 1 minute
6        | 5 minutes
7-8      | 15 minutes
9        | 1 hour

Source: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
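
As a toy model, enforcement of that schedule might look like the code below. The table values are from the guide; the surrounding logic is a guess for illustration, not Apple's implementation.

    import time

    DELAY_BEFORE_ATTEMPT = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}   # seconds
    WIPE_AFTER = 10    # only if the user enabled "Erase Data"

    def guess_passcodes(candidates, check):
        for attempt, guess in enumerate(candidates, start=1):
            time.sleep(DELAY_BEFORE_ATTEMPT.get(attempt, 3600 if attempt > 9 else 0))
            if check(guess):
                return guess
            if attempt >= WIPE_AFTER:
                raise RuntimeError("device wiped: class keys destroyed")
        return None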

Feb 17, 2016

read section 'Hardware Security Features' here: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

Publicizing the case themselves is a very good move.

However, the iPhone of the attacker is an iPhone 5C, which does not have Touch ID or a Secure Enclave. This means that the time between passcode unlock attempts is not enforced by the cryptographic coprocessor. More generally, there's no software integrity protection, and the encryption key is relatively weak (since it is only based on the user's passcode).

The amount of work needed to turn security into good user experience is phenomenal: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 17, 2016

if you are interested in the technical details of iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 16, 2016

Are you suggesting they can brute force AES-256?

iOS's security is quite sophisticated: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 16, 2016

It is used to create the crypto key: the user's passcode is fed into a password-based key derivation function (PBKDF), and the output is the key used for encryption/decryption.

The user's device key is mixed into that PBKDF. Without both parts of the equation, you have nothing.

For your reading enjoyment: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Specifically page 11, the diagram at the bottom.
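
A minimal sketch of that mixing, with an HMAC step standing in for the hardware entanglement: the real derivation runs through the silicon AES engine and is calibrated to roughly 80 ms per guess, and the salt and iteration count below are placeholders, not Apple's parameters.

    import hashlib, hmac

    UID_KEY = bytes(32)          # per-device 256-bit key fused into the silicon (placeholder)
    SALT = b"per-device-salt"    # placeholder

    def derive_encryption_key(passcode: str) -> bytes:
        stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 100_000)
        # Entangle the stretched passcode with the device key, so the result
        # can only ever be computed on this device.
        return hmac.new(UID_KEY, stretched, hashlib.sha256).digest()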

Feb 16, 2016

No. From Apple's iOS security guide[1]:

> The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused (UID) or compiled (GID) into the application processor and Secure Enclave during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed by dedicated AES engines implemented in silicon using the UID or GID as a key. Additionally, the Secure Enclave’s UID and GID can only be used by the AES engine dedicated to the Secure Enclave. The UIDs are unique to each device and are not recorded by Apple or any of its suppliers. ... Integrating these keys into the silicon helps prevent them from being tampered with or bypassed, or accessed outside the AES engine. The UIDs and GIDs are also not available via JTAG or other debugging interfaces.

Even for older devices like the iPhone 5C, if the owner chose a good passphrase, I doubt it can be decrypted even with Apple's help.

1. From the section on Encryption and Data Protection. Starts on page 10: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 16, 2016

Apple goes way out of their way to avoid scenarios where they can be compelled to subvert iOS security. For instance, see pg44+ of the iOS security white paper:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

... the HSMs that manage the escrow scheme for credentials stored in iCloud are themselves rigged to blow up on 10 failed tries, and, not only that, but the code that implements that process is burned into the HSMs and the keys Apple would need to change that logic have been destroyed.

Feb 16, 2016

Page 11 of the iOS Security Guide should help explain a lot: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The user passcode is combined with the UID Key embedded in the Secure Enclave to create an encryption key used for the filesystem (Apple calls this "tangling"). That means you can only crack the key for the filesystem as fast as the Secure Enclave lets you crack it, since every guess is composed of Passcode+Secure_Enclave_Access.

And boy, does the Secure Enclave not like to go fast. Every incorrect guess gets fed back to the SE, and it gets slower and slower until you can only try one guess an hour, or, if the user enabled it, the device erases itself after a certain number of failed attempts. See the table on page 12 of the iOS Security Guide.

This is why disk crypto on iOS is far better than comparable alternatives on Android. Having a hardware crypto chip with a key embedded at manufacture time on every single phone they produce is something that only Apple can really do.

The top poster is correct: in iOS7 and prior there were many default apps that did not use the Data Protection API (aka file encryption). Post-iOS7, most default apps and many 3rd party apps have defaulted to using Data Protection. This means you get very little if you're trying to forensically acquire a disk, as the FBI is, without access to the phone passcode.

Sidenote: My company released a crypto abstraction library for interacting with the Secure Enclave last week. It lets mobile app developers instruct the SE to create an ECC private key on their behalf and then sign things with it. This way, you can make passwordless authentication and device binding on iOS possible for your app, potentially improving UX, increasing security, and simplifying your server-side code. Check it out at https://www.passwordlessapps.com -- we couldn't have made these kinds of security guarantees without the Secure Enclave. We'll eventually support Android, but we'll have to give up some security benefits to do so.

EDIT: I've been informed the device in question is an iPhone 5C (no TouchID == no Secure Enclave). This should make things a lot simpler for the FBI. Now I actually do wonder why they're having trouble?

Feb 06, 2016

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 06, 2016

Occam's razor says that you should select the hypothesis with the fewest assumptions. Saying Apple is a "rent-seeking asshole" assumes that Apple did this maliciously, which is a huge ball of assumptions when they've literally put out a security paper[1] on how Touch ID and the Secure Enclave work.

[1]:https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 06, 2016

I highly recommend reading up on Apple's Security white paper that details how it all works... https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 06, 2016

Basically, yes. The Secure Enclave is hardware isolated from the rest of the chip.

Apple's own security guide explains it best [1]:

> The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but cannot read it. It’s encrypted and authenticated with a session key that is negotiated using the device’s shared key that is provisioned for the Touch ID sensor and the Secure Enclave. The session key exchange uses AES key wrapping with both sides providing a random key that establishes the session key and uses AES-CCM transport encryption.

Regarding the actual fingerprint storage, it looks like the encryption key is kept in the Secure Enclave and the entire decryption and verification process occurs within the Secure Enclave. However the encrypted data itself may be stored outside the Secure Enclave:

> The raster scan is temporarily stored in encrypted memory within the Secure Enclave while being vectorized for analysis, and then it’s discarded. The analysis utilizes sub-dermal ridge flow angle mapping, which is a lossy process that discards minutia data that would be required to reconstruct the user’s actual fingerprint. The resulting map of nodes is stored without any identity information in an encrypted format that can only be read by the Secure Enclave, and is never sent to Apple or backed up to iCloud or iTunes.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 06, 2016

Yes, at least according to Apple: https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 7).

"The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but _cannot_ read it".

Jan 17, 2016

The secure enclave on iOS devices is basically a TPM. See https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Jan 15, 2016

This is similar to how Apple's iOS File Data Protection works with "Protected Unless Open": https://www.apple.com/business/docs/iOS_Security_Guide.pdf

> Some files may need to be written while the device is locked. A good example of this is a mail attachment downloading in the background. This behavior is achieved by using asymmetric elliptic curve cryptography (ECDH over Curve25519). The usual per-file key is protected by a key derived using One-Pass Diffie-Hellman Key Agreement as described in NIST SP 800-56A.

> The ephemeral public key for the agreement is stored alongside the wrapped per-file key. The KDF is Concatenation Key Derivation Function (Approved Alternative 1) as described in 5.8.1 of NIST SP 800-56A. AlgorithmID is omitted. PartyUInfo and PartyVInfo are the ephemeral and static public keys, respectively. SHA-256 is used as the hashing function. As soon as the file is closed, the per-file key is wiped from memory. To open the file again, the shared secret is re-created using the Protected Unless Open class’s private key and the file’s ephemeral public key; its hash is used to unwrap the per-file key, which is then used to decrypt the file.
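
A condensed sketch of that flow (X25519 agreement, Concat KDF, then key wrap, per the description above); the otherinfo fields are simplified to a constant here rather than the public keys the guide specifies:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.concatkdf import ConcatKDFHash
    from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

    class_private = X25519PrivateKey.generate()    # private half is protected until unlock
    class_public = class_private.public_key()

    def kdf(shared: bytes) -> bytes:
        return ConcatKDFHash(hashes.SHA256(), 32, otherinfo=b"protected-unless-open").derive(shared)

    def wrap_while_locked(file_key: bytes):
        eph = X25519PrivateKey.generate()
        kek = kdf(eph.exchange(class_public))
        # The ephemeral public key is stored alongside the wrapped per-file key.
        return eph.public_key(), aes_key_wrap(kek, file_key)

    def unwrap_after_unlock(eph_public, wrapped_file_key):
        kek = kdf(class_private.exchange(eph_public))
        return aes_key_unwrap(kek, wrapped_file_key)

    file_key = os.urandom(32)
    eph_pub, wrapped = wrap_while_locked(file_key)
    assert unwrap_after_unlock(eph_pub, wrapped) == file_key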