Dec 13, 2017

Tangentially, I wonder whether most modern smartphone chips include a hardware random number generator, and whether it's exposed to userspace.

The iPhone has a hardware random number generator, at least: "The Secure Enclave is a coprocessor fabricated in the Apple S2, Apple A7, and later A-series processors. It uses encrypted memory and includes a hardware random number generator."

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

I couldn't immediately find if that functionality was exposed in an API.
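
For what it's worth, third-party apps can't talk to the Secure Enclave's random number generator directly, but iOS does expose a system CSPRNG to userspace through the Security framework. A minimal Swift sketch:

  import Foundation
  import Security

  // Fill a buffer with cryptographically secure random bytes.
  // SecRandomCopyBytes draws from the system CSPRNG; whether any given
  // call is ultimately seeded by the hardware RNG is an implementation
  // detail that apps can't observe.
  var bytes = [UInt8](repeating: 0, count: 32)
  let status = SecRandomCopyBytes(kSecRandomDefault, bytes.count, &bytes)
  precondition(status == errSecSuccess, "RNG failure: \(status)")
  print(bytes.map { String(format: "%02x", $0) }.joined())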

Nov 25, 2017

> It’s a black box that we’re not supposed to know anything about

Nope. Apple published a whitepaper that details how the SEP works.[1] Decrypting the firmware does help researchers look for vulnerabilities in the implementation, but it's not like Apple is relying on it being a black box.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Nov 25, 2017

It does more than that.

From the [ios security guide]:

> The Secure Enclave provides all cryptographic operations for Data Protection key management and maintains the integrity of Data Protection even if the kernel has been compromised.

e.g. you can encrypt and decrypt, referencing a key by id, but without having the private key ever leave the enclave, even if the app or iOS kernel gets compromised.

[ios security guide] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

The Secure Enclave section is pretty short and the entire document is very approachable.
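
To make the "key by id" point concrete, here's a minimal Swift sketch using the public SecKey API on devices with a Secure Enclave (the application tag below is just a placeholder):

  import Foundation
  import Security

  // Generate a P-256 keypair inside the Secure Enclave and use it for
  // ECIES encryption. The SecKey value is only a handle; the private
  // key material never leaves the enclave.
  var err: Unmanaged<CFError>?
  let attrs: [String: Any] = [
      kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
      kSecAttrKeySizeInBits as String: 256,
      kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
      kSecPrivateKeyAttrs as String: [
          kSecAttrIsPermanent as String: true,
          kSecAttrApplicationTag as String: Data("com.example.sep-key".utf8),
      ],
  ]
  guard let priv = SecKeyCreateRandomKey(attrs as CFDictionary, &err),
        let pub = SecKeyCopyPublicKey(priv) else {
      fatalError("key generation failed: \(err!.takeRetainedValue())")
  }
  let alg: SecKeyAlgorithm = .eciesEncryptionCofactorX963SHA256AESGCM
  let ct = SecKeyCreateEncryptedData(pub, alg, Data("secret".utf8) as CFData, &err)! as Data
  // Decryption happens inside the enclave; only the plaintext comes back.
  let pt = SecKeyCreateDecryptedData(priv, alg, ct as CFData, &err)! as Data
  assert(pt == Data("secret".utf8))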

Nov 13, 2017

Interesting. Can you point to any official white paper from Apple claiming this? I'm reading this: https://www.apple.com/business/docs/iOS_Security_Guide.pdf but I cannot find any such information about a living person.

Nov 08, 2017

It has nothing to do with ISIS…

The FBI tried to access the San Bernardino shooter's iPhone 5c, but they screwed up by trying to access the phone on a Wi-Fi network it had never been on, putting themselves in a position where the iPhone would be erased after a certain number of incorrect passcode attempts.

The FBI wanted Apple to create a special version of iOS they could load on the phone to bypass the auto erase feature and Apple refused.

Apple didn't say they could access the Texas shooter's iPhone; only that they could possibly provide some assistance. It would depend on the model and whether or not encryption was enabled, something we currently don't know.

Apple, like most tech companies, routinely works with law enforcement; we don't hear about it unless it's a mass shooting that gets nationwide attention.

Bottom line: if the phone is encrypted and certain other features are enabled, there's nothing Apple can do.

You can read all about it at https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Oct 31, 2017

I don’t think Apple could sell transaction data:

> Apple Pay retains anonymous transaction information such as approximate purchase amount. This information can’t be tied back to the user and never includes what the user is buying.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf, page 38.

They don’t have much information that the issuing bank/card network do not have already.

Oct 24, 2017

Apple does take security quite seriously: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

A version for iOS 11, with detailed FaceID information, should be coming out soon as well.

Sep 27, 2017

This document has more information on the Secure Enclave.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Sep 27, 2017

Secure Enclave isn't something new to FaceID. It was developed for Touch ID. Some details here: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Sep 17, 2017

Regarding your first point, it's difficult to implement some security schemes at the operating system level alone. With full vertical control of the product, you can have nice things like secure enclaves and de-facto hardware cryptography acceleration.

See here for details: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

A lot of the features would be very difficult to implement in Android without cooperating hardware, and hardware is notoriously expensive to get right and scale up. Projects like neo900 and Purism regularly encounter delays, unexpected costs, and pricing issues. It's really tough.

On a broader note, people are spending more and more time in data-hungry apps anyway, which can send almost anything they want to the network. This is sure to chip at any device-level security, pushing it towards irrelevance. I wish I had a log entry every time an app used the location service on my phone along with a database containing a history of Internet-transacted data.

Sep 17, 2017

It all boils down to what you see in public. The FBI tried to get Apple to create an iOS variant to extract data from a device, to the point where Tim Cook wrote a letter refusing to do so and was willing to fight it in court. Similarly, Apple has given talks and published white papers on iOS security. Despite these well-documented white papers and talks discussing the internals of iOS, no well-known or trivial exploits have surfaced that demonstrate flaws in them.

In the case of the FBI fiasco, for example, we know that the FBI later paid Cellebrite for an exploit that allowed them to unlock the device. We don't know the details of how that happened, but the FBI had to reach out to an independent company and exploit an older device to do it.

To be clear: if they were lying about the lengths they go to with encryption, on-device security, and user trust, this story would have played out differently. The FBI would have already had a backdoor or been able to trivially break into the device, or Apple would have complied and created an iOS variant to break in. All we know is what they stood their ground on, and the effort that was required for the FBI to get in.

iOS 10 security white paper: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Talk about iOS security at Black Hat: https://www.youtube.com/watch?v=BLGFriOKz6U

Sep 11, 2017

Here's the whitepaper about iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Sep 11, 2017

Cheers. I'd appreciate it.

Edit to add: I came across this one:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

It was linked from an article in 2014[1], though this PDF document is dated March 2017.

[1]: http://www.biometricupdate.com/201402/apple-publishes-whitep...

Jul 31, 2017

Apple Pay operates using network tokenization, which is part of the EMV standard. This works in two parts.

When you add a card to Apple Pay, your device checks whether the bank supports Apple Pay and gets a device token from your bank, along with some cryptographic data. The original card number you typed in is discarded; it is replaced by the device token, which looks like a credit card number. All this data is stored in the device’s Secure Element. It’s important to note that Apple does not store card data and doesn’t even take part in the transaction flow.

When you pay with Apple Pay, your device generates a network token. In practice, this network token is your device token, which is sent along with a cryptographic signature, the cryptogram (generated using the data previously stored in the Secure Element). This data fits into the traditional card fields (name/number/expiration date/cvv), plus an additional field for the cryptogram. The token is transmitted through the merchant to their gateway/acquirer, which passes it to the payment network (Visa/Mastercard). The network checks the cryptogram, does a reverse lookup of the device token to associate it with the real card number, and the transaction proceeds like a normal transaction using the real card number.
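
To make the shape of that flow concrete, here's a toy Swift model; all the names are hypothetical and the cryptogram is a stand-in (the real one is EMV-specified):

  import Foundation

  struct NetworkTokenPayload {
      let deviceToken: String   // PAN-shaped stand-in for the real card number
      let expiration: String    // fits the standard expiration field
      let cryptogram: Data      // per-transaction value from device-held key material
  }

  // Network side: check the cryptogram, reverse-map the device token to
  // the real PAN, then proceed as a normal card transaction.
  func authorize(_ p: NetworkTokenPayload,
                 tokenVault: [String: String],
                 verifyCryptogram: (Data) -> Bool) -> String? {
      guard verifyCryptogram(p.cryptogram) else { return nil }
      return tokenVault[p.deviceToken]   // nil if the token was never provisioned
  }

  let vault = ["4111000011112222": "4000123412341234"]   // device token -> real PAN
  let payload = NetworkTokenPayload(deviceToken: "4111000011112222",
                                    expiration: "12/29",
                                    cryptogram: Data([0x01, 0x02]))
  print(authorize(payload, tokenVault: vault) { _ in true } ?? "declined")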

https://www.emvco.com/wp-content/uploads/documents/EMVCo_Pay... : The official tokenization specification. Page 71 has a good diagram of the flow (the Token Service Providers are usually the networks themselves).

https://www.apple.com/business/docs/iOS_Security_Guide.pdf : The iOS Security Guide, which has interesting information in the Apple Pay section (page 34).

IIRC, Android Pay works using the same standard, with Google storing the device tokens in the cloud and generating network tokens on behalf of the user (not all Android devices have secure elements). This may be inaccurate; I haven’t had the opportunity to implement it at the payment provider level (yet!).

Jun 11, 2017

> I've done it several times. The ability to restore messages to a device...

Interesting. To be clear, you're not just talking about restoring messages to the same device, like after resetting it?

Looking more closely at Apple's security whitepaper, perhaps restoring history on a new device is possible if you enable iCloud Keychain. Looks like that would in fact share the private decryption keys among devices.[1]

Ah, and that more clearly points at what this iMessage change may be: Mandatory iCloud Keychain, at least as far as iMessage keys are concerned. Which would suggest another, hidden improvement: no more need to redundantly encrypt a copy of every message for every recipient device!

I want to add however that this still does not suggest anything about changing the security of backups, which was the implication of the article. Nor would I necessarily characterize iCloud keychain as "breaking" encrypted architecture.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Jun 11, 2017

We don't have to speculate how Apple could possibly handle account recovery without entirely sacrificing security, because it's spelled out in their iOS security whitepaper: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

TL;DR: Keychain recovery relies on a cluster of hardware security modules to enforce the recovery policy. After 10 tries to guess your PIN, the HSM will destroy the keys. Apple support gates all but the first few of these tries. The paper also implies that you can use a high entropy recovery secret as an alternative, though I can't figure out how you would enable that.

This seems like a pretty reasonable point in design space to me. Of course, you are relying on Apple's trustworthiness and competence to implement this design. But that is true without recovery, since the client software is also implemented by Apple.
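
To make the policy concrete, here's a toy Swift model of the try-limited escrow (purely illustrative; obviously not Apple's HSM code):

  import Foundation

  final class EscrowRecord {
      private var wrappedKey: Data?   // escrowed keychain key, recoverable by PIN
      private var attemptsLeft = 10
      private let pinHash: Data

      init(wrappedKey: Data, pinHash: Data) {
          self.wrappedKey = wrappedKey
          self.pinHash = pinHash
      }

      // After 10 wrong guesses the key is destroyed, so a low-entropy
      // PIN can't be brute-forced against the escrow service.
      func recover(pinHashGuess: Data) -> Data? {
          guard attemptsLeft > 0, wrappedKey != nil else { return nil }
          attemptsLeft -= 1
          if pinHashGuess == pinHash { return wrappedKey }
          if attemptsLeft == 0 { wrappedKey = nil }   // destroy the keys
          return nil
      }
  }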

Jun 07, 2017

> someone in law enforcement tasked with retrieving message logs for investigations.

Right, but do they retrieve them from iCloud? Without Apple's assistance, and without knowing the user's password?

> I was pretty skeptical but I've yet found any proof or documentation from Apple's support docs disproving this.

Well, here's the brief overview: https://support.apple.com/en-us/HT202303

and here's the iOS security whitepaper: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

It includes a section on iCloud security, with this passage:

  iCloud secures the content by encrypting it when sent 
  over the Internet, storing it in an encrypted format, 
  and using secure tokens for authentication.

I am no security expert, but I am pretty sure the FBI wouldn't have had a huge fight with Apple if they had any way to get at the data directly (and once they figured out they could use a vuln in the old iOS to break into the device, they did indeed drop the fight).

> FBI/Apple had the ability to get historic message history from the iCloud backup

Right, because they reset the shooter's Apple ID password. Not because the backup was in plaintext.

> As for your scenario -- doesn't that explicitly confirm that the messages are not encrypted safely at rest? You can restore to an entirely new device, using the same backup, and retrieve the messages.

How does that follow? You still need to supply your password to decrypt the backup before you can restore it. From the same security whitepaper:

  When files are created in Data Protection classes that aren’t accessible 
  when the device is locked, their per-file keys are encrypted using the 
  class keys from the iCloud Backup keybag. Files are backed up to iCloud 
  in their original, encrypted state. Files in Data Protection class 
  No Protection are encrypted during transport.

  The iCloud Backup keybag contains asymmetric (Curve25519) keys for each 
  Data Protection class, which are used to encrypt the per-file keys. For 
  more information about the contents of the backup keybag and the iCloud 
  Backup keybag, see “Keychain Data Protection” in the Encryption and Data 
  Protection section.

  The backup set is stored in the user’s iCloud account and consists of a 
  copy of the user’s files, and the iCloud Backup keybag. The iCloud Backup 
  keybag is protected by a random key, which is also stored with the backup 
  set. (The user’s iCloud password isn’t utilized for encryption so that 
  changing the iCloud password won’t invalidate existing backups.)

  While the user’s Keychain database is backed up to iCloud, it remains 
  protected by a UID-tangled key. This allows the Keychain to be restored 
  only to the same device from which it originated, and it means no one 
  else, including Apple, can read the user’s Keychain items.

  On restore, the backed-up files, iCloud Backup keybag, and the key for 
  the keybag are retrieved from the user’s iCloud account. The iCloud Backup 
  keybag is decrypted using its key, then the per-file keys in the keybag 
  are used to decrypt the files in the backup set, which are written as new 
  files to the file system, thus re-encrypting them as per their 
  Data Protection class.
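
The hierarchy in that passage is ordinary envelope encryption. A toy Swift sketch of the same shape, using AES-GCM throughout for brevity (the real class keys are Curve25519, as the quote says):

  import CryptoKit
  import Foundation

  // file <- per-file key <- class key <- keybag key (stored with backup)
  let keybagKey  = SymmetricKey(size: .bits256)   // "protected by a random key"
  let classKey   = SymmetricKey(size: .bits256)
  let perFileKey = SymmetricKey(size: .bits256)

  func keyData(_ k: SymmetricKey) -> Data { k.withUnsafeBytes { Data($0) } }

  let file            = Data("file contents".utf8)
  let encFile         = try! AES.GCM.seal(file, using: perFileKey).combined!
  let wrappedFileKey  = try! AES.GCM.seal(keyData(perFileKey), using: classKey).combined!
  let wrappedClassKey = try! AES.GCM.seal(keyData(classKey), using: keybagKey).combined!

  // Restore path, mirroring the quote: unwrap the class key, then the
  // per-file key, then decrypt the file itself.
  let ck = SymmetricKey(data: try! AES.GCM.open(
      try! AES.GCM.SealedBox(combined: wrappedClassKey), using: keybagKey))
  let fk = SymmetricKey(data: try! AES.GCM.open(
      try! AES.GCM.SealedBox(combined: wrappedFileKey), using: ck))
  let restored = try! AES.GCM.open(
      try! AES.GCM.SealedBox(combined: encFile), using: fk)
  assert(restored == file)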

Jun 06, 2017

You should read the Apple Pay portion of the iOS security guide; the security benefits and the amount of thought that has gone into the system are pretty amazing.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

May 11, 2017

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Find the section titled "Secure Enclave." When the SE needs to store data on the filesystem, it's encrypted with a key that never leaves the SE. Effectively, assuming the encryption is implemented correctly, data 'owned' by the SE is never available to any other part of the system.

May 11, 2017

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

You're looking for the bit on the "Secure Enclave".

Apr 03, 2017

Apple's been shipping hardware DMA restrictions on Macs for a while (the issue first appeared on the security radar in the FireWire era), so it's not inconceivable that it's not as simple as getting DMA access, but https://www.apple.com/business/docs/iOS_Security_Guide.pdf seems to be silent on that unless I'm missing something.

Mar 30, 2017

iMessage has end to end encryption and supports multiple devices. Each device uses its own keypair and when a message is sent, it is sent encrypted for each of the recipient's devices.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf
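
A sketch of that fan-out structure in Swift (iMessage's actual scheme differs in detail; this just shows the per-device shape):

  import CryptoKit
  import Foundation

  // One ciphertext per recipient device, each under that device's key.
  let deviceKeys = (0..<3).map { _ in Curve25519.KeyAgreement.PrivateKey() }
  let sender     = Curve25519.KeyAgreement.PrivateKey()
  let message    = Data("hello".utf8)

  let perDeviceCiphertexts: [Data] = try! deviceKeys.map { device in
      let shared = try sender.sharedSecretFromKeyAgreement(with: device.publicKey)
      let key = shared.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                               sharedInfo: Data(), outputByteCount: 32)
      return try AES.GCM.seal(message, using: key).combined!
  }
  // Each device can decrypt only its own copy with its own private key.
  print(perDeviceCiphertexts.count)   // 3 copies for 3 devices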

Mar 28, 2017

This is known behavior: according to the iOS 10 Security white paper [1], "iOS uses randomized Media Access Control (MAC) address when conducting Wi-Fi scans while it isn't associated with a Wi-Fi network... Note that Wi-Fi scans which happen while trying to connect to a preferred Wi-Fi Network aren't randomized".

I haven't put much thought into it, but I wonder why they don't randomize all probe requests...

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf
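
Mechanically, a randomized MAC is just random bytes with the locally-administered bit set and the multicast bit cleared in the first octet. A sketch (not Apple's implementation):

  import Foundation
  import Security

  func randomMAC() -> String {
      var b = [UInt8](repeating: 0, count: 6)
      _ = SecRandomCopyBytes(kSecRandomDefault, b.count, &b)
      b[0] = (b[0] | 0x02) & 0xFE   // locally administered, unicast
      return b.map { String(format: "%02x", $0) }.joined(separator: ":")
  }
  print(randomMAC())   // e.g. 06:1a:2b:3c:4d:5e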

Mar 23, 2017

Check out the iOS Security Guide from Apple for such information: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Mar 23, 2017

If you're interested in how iOS security works, Apple publishes white papers on the subject.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Feb 28, 2017

iCloud and iCloud Keychain are not really the same thing. iCloud Keychain is designed, among other things, not to disclose user passwords in the event of an iCloud account compromise. The iOS Security Guide[1] has more details on this topic, starting on page 45.

That's not to say other solutions can never be as secure, but it's a fairly good design nevertheless.

[1]: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Jan 12, 2017

My understanding is that Apple Pay enables the payment, but Apple does not collect or retain any data such as to whom you paid how much. Here's what Apple's iOS Security Guide says:

"Apple Pay is also designed to protect the user’s personal information. Apple Pay doesn’t collect any transaction information that can be tied back to the user. Payment transactions are between the user, the merchant, and the card issuer."

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Jan 09, 2017

Apple might not be perfect, but they are ahead of the alternatives in terms of privacy and security.

Apple builds iOS to sell hardware. Google builds Android to vacuum up more data to sell.

Have you had a look at the iOS Security white paper [1]?

"Spotlight Suggestions never sends exact location, instead blurring the location on the client before sending."

"iOS also uses a randomized MAC address [...] so it can’t be used to persistently track a device by passive observers of Wi-Fi traffic."

Spotlight: "Unlike most search engines, however, Apple’s search service does not use a persistent personal identifier across a user’s search history to tie queries to a user or device; instead, Apple devices use a temporary anonymous session ID for at most a 15-minute period before discarding that ID."

Apple Pay: "Full card numbers are not stored on the device or on Apple servers. Instead, a unique Device Account Number is created, encrypted, and then stored in the Secure Element. This unique Device Account Number is encrypted in such a way that Apple can’t access it."

iMessage: "Apple does not log messages or attachments, and their contents are protected by end-to-end encryption so no one but the sender and receiver can access them. Apple cannot decrypt the data."

FaceTime: "The audio/video contents of FaceTime calls are protected by end-to-end encryption, so no one but the sender and receiver can access them. Apple cannot decrypt the data."

"Location Services can be turned off using a single switch in Settings, or users can approve access for each app that uses the service. [...] Additionally, users are given fine-grained control over system services’ use of location information."

Plus you have fine-grained controls for access to microphone, camera, pictures, etc. per app. Has Android caught up with that, or is it still all-or-nothing (accept every permission or don't use the app)?

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf