On Apple's Cloud Key Vault - and why it *might* actually be secure

There is a nice writeup on this at Lawfare. (TL;DR - holding on to keys in a way that lets third parties also access them repeatedly, without a high potential for catastrophic loss, is impossibly hard.)
...Apple’s design intentionally solved the problems that come from exceptional access schemes by removing itself from the equation. Rather than providing an exceptional access solution, Apple took the radical step of destroying those keys in order to have an acceptable level of protection. 
[To] turn Apple’s Cloud Key Vault into an exceptional access mechanism....Apple would have to replace the HSM with one that accepts an additional message from Apple or the FBI—or an agency from any of the 100+ countries where Apple sells iPhones—saying “OK, decrypt,” as well as the user’s password. In order to do this securely, these messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. Any exceptional access scheme made from this system would have to have an additional set of keys to ensure authorized use of the law enforcement access credentials. Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck with keeping a second signing key (or database of second signing keys) for signing these messages that must be accessed for each and every law enforcement agency. [As a result] Apple [would need] to protect another repeatedly-used, high-value public key infrastructure: an equivalent situation to what has already resulted in the theft of Bitcoin wallets, RealTek’s code signing keys, and Certificate Authority failures, among many other disasters.
Repeated access of private keys drastically increases their probability of theft, loss, or inappropriate use. Apple’s Cloud Key Vault does not have any Apple-owned private key, and therefore does not indicate that a secure solution to this problem actually exists. 
...Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that requests for access from the government of Uzbekistan actually involved a device that was located in that country, and that the request was consistent with both US law and Uzbek law. [It's] not that the technical community doesn’t know how to store high-value encryption keys—that’s the whole point of an HSM. Rather, we assert that holding on to keys in a safe way such that any other party (i.e. law enforcement or Apple itself) can also access them repeatedly without high potential for catastrophic loss is impossible with today’s technology.
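To make the quoted scenario a bit more concrete, here is a minimal Python sketch of the decision such a modified HSM would have to make. This is emphatically not Apple's code - every name here (ReleaseRequest, authorize_key_release, the key types, the iteration counts) is invented for illustration - but it shows why the second path drags a long-lived, repeatedly used signing key into the design:

```python
# Hypothetical sketch only - not Apple's design; all names are made up.
from dataclasses import dataclass
from typing import Optional
import hashlib
import hmac

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


@dataclass
class ReleaseRequest:
    """A request to the vault; exactly one credential path is populated."""
    passcode: Optional[bytes] = None      # the end user's passcode
    warrant_blob: Optional[bytes] = None  # serialized law-enforcement request
    signature: Optional[bytes] = None     # signature over warrant_blob


def authorize_key_release(req: ReleaseRequest,
                          salt: bytes,
                          stored_passcode_hash: bytes,
                          le_pubkey: ed25519.Ed25519PublicKey) -> bool:
    # Path 1: roughly today's design - only the user's passcode unlocks the
    # escrow record, and the real HSM enforces a small retry budget before
    # wiping it.
    if req.passcode is not None:
        candidate = hashlib.pbkdf2_hmac("sha256", req.passcode, salt, 100_000)
        return hmac.compare_digest(candidate, stored_passcode_hash)

    # Path 2: the hypothetical exceptional-access addition - an "OK, decrypt"
    # message signed with a second key. That key has to stay online and be
    # used for every single lawful-access request, which is exactly the
    # repeatedly-used, high-value key the quoted passage warns about.
    if req.warrant_blob is not None and req.signature is not None:
        try:
            le_pubkey.verify(req.signature, req.warrant_blob)
            return True
        except InvalidSignature:
            return False

    return False
```

The technical change is one extra `if` branch; the operational change is that someone now has to guard, and constantly exercise, the private half of `le_pubkey`.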
There is quite a bit to unpack in that quote, and I could go into it, but I'll leave it to the comments on Schneier's (excellent) blog. It's worth reading it all, but most of it comes down to:
  • impossible doesn't mean what you think it means - what is computationally infeasible today may not be in 10 years, and that doesn't even begin to get into quantum computing, etc.
  • Apple can potentially update its custom code on the HSM to do all sorts of "fun" things (see the sketch after this list)
  • There is no source code, there are no design docs, and there really, really is no sunlight on any of this. We basically have Apple's word on what's going on back there. Do you actually, really trust them?
  • And, of course, all of this presumes that the _other_ weaknesses in the system are not exploitable...
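On that second bullet: the publicly described mitigation is the one from the quote - Apple destroys the keys that would let it push new code to the HSMs. The trust assumption underneath is a bare signature check, roughly like the following sketch (again hypothetical, all names invented): if the corresponding private key were retained rather than destroyed, a later "update" could quietly replace the release policy sketched above.

```python
# Hypothetical model of an HSM's firmware-update gate - not Apple's code.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def accept_firmware_update(image: bytes,
                           signature: bytes,
                           vendor_pubkey: ed25519.Ed25519PublicKey) -> bool:
    """Accept new HSM code only if it is signed by the configured vendor key."""
    try:
        vendor_pubkey.verify(signature, image)
        return True   # signed by whoever still holds the vendor private key
    except InvalidSignature:
        return False  # reject unsigned or tampered images
```

Everything hinges on nobody being able to produce that signature - which is exactly the claim we are asked to take on faith.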

