The 2016 encrypted Apple iPhones: are they really secure (full details)?

Updated 4-13-2017
Note to users: Android phones (and presumably iPhones as well) are only protected by encryption when powered all the way off! The filesystem is presumably still mounted (and thus decrypted) when the device is merely locked. Cops don’t like locked phones either, but they HATE encrypted phones captured all the way off.

In early 2016, the FBI went to court in an effort to force Apple to write custom “back door” software for use against recent encrypted iPhones. In a move nobody would have expected from Apple prior to the Snowden revelations, Apple stood up to the bullies and refused to cooperate. In the end, Apple stood strong, and the FBI resorted to an outside hacker, presumably to bypass some part of Apple’s security protecting devices “encrypted” with those weak numerical “knock patterns.”

What the FBI tried and failed to demand from Apple was that they themselves write a “crack” or backdoor that could be installed after the fact on a phone taken while locked from a previously unknown target. For political reasons, they used the San Bernardino (ISIS/Daesh) shooter’s phone as the test case. It is known they wanted laws restricting encryption for at least a year prior to this incident, and that they actually want Apple to help them unlock at least ten more phones. This is not about Daesh, and not about Apple either: it’s about the FBI wanting to be able to unlock YOUR phone and YOUR data any time they want.

Now for the technical part of this: the FBI cannot “brute force” even weak knock-pattern encryption on recent iPhones, because ten wrong passphrases trigger automatic destruction of all data. This is a brilliant security solution, but it is apparently a band-aid. The FBI claims that without it they could quickly try all possible keys and thus guarantee they could get in (barring a full alphanumeric/multiword passphrase). If and only if this is true, it means the iPhone relies on weak encryption “protected” by closed, proprietary software that wipes the data and presumably prevents anyone from simply making thousands of “cloned” copies of the locked, encrypted data for unlimited attempts.
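The wipe-after-ten-failures defense can be sketched as a toy model (hypothetical, not Apple’s actual code; the names here are illustrative only). The key point is that the failure counter lives in software, so anyone who can clone the raw ciphertext sidesteps it entirely:

```python
# Toy model of a retry-limited lock screen (hypothetical, not Apple's code).
# After MAX_ATTEMPTS consecutive wrong passcodes, the disk key is destroyed,
# leaving only unreadable ciphertext behind.

MAX_ATTEMPTS = 10

class LockScreen:
    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False          # True once the disk key has been erased

    def try_unlock(self, guess):
        if self.wiped:
            return False            # key is gone; nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True       # erase the key material
        return False

phone = LockScreen("4921")
for pin in range(10):               # ten wrong guesses in a row
    phone.try_unlock(str(pin).zfill(4))
print(phone.wiped)                  # prints True: the data is now gone
```

Because the limit is enforced by the phone’s firmware rather than by the mathematics of the cipher, a lab that can image the flash chips gets unlimited attempts against copies, which is exactly why this counts as a band-aid.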

I don’t own any Apple hardware, as I disapprove of Apple’s “walled garden” business model, so I do not know whether the iPhone can be set up to use a real multiword passphrase, or a string of dozens of random characters, like you’d use to encrypt a computer hard drive, but Android phones certainly can and should use such a passphrase. Android does not destroy the filesystem after too many wrong passphrases, but you can use a real passphrase, not just a numerical “knock pattern,” to encrypt it. This is far superior to relying on the device itself. A real passphrase on a touchscreen is a slow nuisance to enter every time you start the phone, but a nightmare for anyone trying to brute-force entry. Not only that, Android uses the same dm-crypt disk-encryption layer (the kernel machinery behind LUKS) that Linux computers use, since Android uses a Linux kernel. The only known countermeasure is a keylogger or disk-key recorder installed in advance against a target (and a phone) known in advance. The hack used against Apple was all about defeating the filesystem wipe so a brute-force attack could be attempted, and this sort of thing does not defeat strong passphrases.
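To see why a real passphrase changes the picture so much, compare rough keyspace sizes. The swipe-pattern figure is the commonly cited total for Android’s nine-dot grid (four to nine dots), and the word count assumes a standard 7,776-word Diceware list:

```python
import math

# Rough keyspace comparison: what an attacker with unlimited guesses faces.
candidates = {
    "4-digit PIN":            10 ** 4,
    "9-dot swipe pattern":    389_112,     # commonly cited total for 4-9 dots
    "8 random lowercase":     26 ** 8,
    "6-word Diceware phrase": 7776 ** 6,   # standard 7,776-word list
}

for name, n in candidates.items():
    print(f"{name:24s} {n:.3g} keys  (~{math.log2(n):5.1f} bits)")
```

A six-word phrase weighs in near 78 bits of entropy, roughly sixty billion billion times more guesses than every swipe pattern combined, which is the difference between seconds and geological time for a brute-force rig.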

Normal forensic practice against computers is to image (make a bit-for-bit copy of) the hard drive, verify the copy is good, then never touch the original device again as a precaution against accidentally damaging it. All subsequent work is done on copies. The news reports about the FBI’s captured iPhones indicate something is preventing them from doing this. That something must be the iPhone’s closed, proprietary firmware, and such a defense is beyond the purview of encryption software.

In the field of cryptography, a cipher is only considered secure if an adversary can know everything about it except the actual key used, and then only if the number of possible keys exceeds the ability of an adversary to try them all. A lock on your door that can be defeated by anyone who knows how it goes together, without needing the key, is no lock at all. A lock that can be opened by trying every key on the burglar’s keyring is no good either. If the news reports are true, Apple’s defense against brute-force attack on phone encryption is comparable to that lock being on a safe full of secret papers, with a special ignition device burning the safe’s contents if ten wrong keys are tried. The FBI wanted Apple to help them defuse the igniter so they could open the lock, which they say has too few possible keys, or whose users pick keys from too small a subset.

If Apple built the iPhone to use all of the possible keys of a modern encryption cipher like AES-256, and not just a few of them, circa-2008 reports said it would take the whole surface of the Earth covered with computers 4,000 years to try them all. Today it would still be centuries, maybe as much as 1,000 years, if only CPU- and not graphics-card-based computers were used. Thus passphrases are attackable but the actual disk keys are not.
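The arithmetic behind that kind of claim is easy to check for yourself. Even under absurdly generous assumptions about attacker hardware (the figures below are made up to be far beyond anything real), a full 256-bit keyspace stays out of reach:

```python
# Back-of-envelope brute-force estimate for a full 256-bit keyspace.
keys = 2 ** 256                       # ~1.16e77 possible AES-256 keys
machines = 1e12                       # a trillion cracking machines (generous)
keys_per_sec_each = 1e9               # a billion keys/sec per machine (generous)
seconds_per_year = 3.156e7

years = keys / (machines * keys_per_sec_each) / seconds_per_year
print(f"~{years:.1e} years")          # on the order of 1e48 years
```

That is dozens of orders of magnitude longer than the age of the universe, which is why attackers go after the passphrase, the firmware, or the user instead of the raw key.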

Encryption is normally only effective if the user picks a strong passphrase, not something like “God123456,” which would be among the first dozen or so tried. Any one-word passphrase can be cracked in less than a second by an ordinary desktop computer, so long as the word is found in some dictionary in some language.
If the passphrase is the username, “123456,” “admin,” or “God,” it will be one of the first four tried and could be cracked in less than a minute by a person simply using the device normally. I myself have found this useful when encountering toilets locked with the common five-button mechanical locks. Things like “1-2-5” and “1-2-4” seem to work a lot more often than one would expect, and there are only 325 possible combinations if no button can be used twice.
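The number of possible combinations on such a five-button lock can be counted by direct enumeration. This models the simplest case, buttons pressed one at a time with no button used twice; real Simplex-style locks that also allow pressing two buttons simultaneously have somewhat more:

```python
from itertools import permutations

# Count press sequences on a 5-button lock: each button used at most once,
# pressed one at a time, sequence length 1 through 5.
buttons = range(1, 6)
total = sum(1 for k in range(1, 6) for _ in permutations(buttons, k))
print(total)   # 5 + 20 + 60 + 120 + 120 = 325
```

Either way, a keyspace in the low hundreds is small enough to exhaust by hand in an afternoon, which is the whole point of the anecdote.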

A multiple-word phrase not found in any book ever published is the minimum necessary to secure an ordinary computer against any kind of determined decryption attempt, not only by law enforcement but by any hacker with access to a gaming computer. Gaming computers running powerful graphics-card-based password-cracking software are known as “first password shooters” and can essentially get into anything the FBI or Secret Service can crack by trying lots of different keys. Once such attackers are in, things like banking and credit card information are at risk of being sold to the highest bidder. A long passphrase not found in any book can stop these systems; a long random character string is the “gold standard” of security, but hard to memorize. A passphrase written down or stored on a flash drive is useless against police raids and any other burglars.
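A toy version of such an attack shows why any single dictionary word, with or without the usual digit suffixes, falls instantly. The five-word list here is a stand-in for the multi-million-entry wordlists crackers actually use, and the hash is plain SHA-256 for simplicity:

```python
import hashlib
from itertools import product

# Toy dictionary attack against a password hash (illustration only).
wordlist = ["dragon", "sunshine", "letmein", "password", "god"]
suffixes = ["", "1", "12", "123", "123456"]

def crack(target_hash):
    # A real GPU rig runs billions of such guesses per second.
    for word, suffix in product(wordlist, suffixes):
        guess = word + suffix
        if hashlib.sha256(guess.encode()).hexdigest() == target_hash:
            return guess
    return None

target = hashlib.sha256(b"god123456").hexdigest()
print(crack(target))   # prints 'god123456', found almost instantly
```

The same loop run over a real wordlist with rules for capitalization and suffixes is essentially what “first password shooter” software does; a multiword phrase never printed anywhere simply isn’t in any list to be tried.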

In summary, Apple used the data-destroying backup security system to cover for the very weak “knock patterns” commonly used for smartphone encryption. I do not know if it is possible to use an actual secure passphrase to lock an iPhone (as on Android); if it is, and the intended target used one, the FBI’s demand would not help them at all. Only an always-on keylogger, data backup to a cloud server that cannot be turned off, or prior access to a known target to install or activate a passphrase logger could help them then.

Apple now says they intend to modify future iPhones so the firmware cannot be “updated” or modified on a locked machine without the passphrase, no matter what any court may demand. This might be insurance against someone like Donald Trump demanding routine backdoor installation, which could create a nightmare scenario where millions of people in the US depend on encryption to stay out of internment camps or worse.

It is also unknown if Apple’s cipher chips (closed and secret) even use all available keyspace and strong random number generation in creating the disk keys in the first place. If not, the problem then becomes this: an iPhone that is secure today becomes insecure tomorrow if the NSA or Secret Service manages to create new hardware for cloning all the encrypted data (ciphertext) stored in that phone to an ordinary computer hard drive. From an unlocked phone bought at an Apple store they could reverse-engineer the encryption routine trivially to determine which keys are possible keys for the otherwise-strong AES encryption supported by the chips used in these phones. Once they knew what small subset of keys the iPhone’s cipher lock could ever generate, and had a way to create the still-locked backup copies they could try all possible keys no matter what Apple does.
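The worry about keyspace and random number generation is concrete: if the key generator draws on too little real randomness, the cipher’s nominal strength is irrelevant. A hypothetical generator seeded with only 32 bits (purely illustrative, not a claim about Apple’s chips) shows the collapse:

```python
import hashlib

# Hypothetical flawed key generator: a 256-bit AES key derived entirely
# from a 32-bit seed. The cipher is strong; the keyspace is not.

def weak_keygen(seed32: int) -> bytes:
    assert 0 <= seed32 < 2 ** 32
    return hashlib.sha256(seed32.to_bytes(4, "big")).digest()   # 32 bytes

key = weak_keygen(123456789)
print(len(key) * 8, "bit key")    # a full 256-bit key...
print(f"...but only {2**32:,} possible keys (~32 bits of security)")
```

An attacker who reverse-engineered the generator would not attack AES at all; they would simply run every possible seed through it, which is a few billion tries, well within reach of one ordinary computer.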

Also, you would have to trust that Apple was still selling the secure phones and had not secretly removed part of the security in response to orders from the FBI, the NSA, etc.

As for an iPhone, an Android phone, a laptop, a desktop computer, or any other device that has passed through the custody of an opponent: once this has happened, the device is considered forever unsafe for handling encrypted data. If you are arrested with a smartphone or any other computing device, it must never be used again; it must be destroyed and replaced. Hit it with a hammer and throw it out. You can’t remove the battery from an iPhone, so do this outdoors in case you hit the battery and it explodes like a firecracker. Usually they don’t, but an occasional battery might go “full Samsung” on you. With most Android phones, you can remove the battery to avoid this risk.

The replacement should be bought randomly off the shelf with cash, never ordered in advance. It is trivial to install hardware, firmware, or software keyloggers in a device that is physically in your hands with enough time to finish the job, so throw away your phone every time you are arrested with one! This is even more important now than before, as keyloggers are one of the few ways to capture the passphrase of an encrypted computer, which is what an encrypted phone of any kind is.