In early 2016, the FBI went to court in an effort to force Apple to write custom “back door” software for use against recent encrypted iPhones. In a move nobody would have expected from Apple prior to the Snowden revelations, Apple stood up to the bullies and refused to cooperate. As this is written, nobody knows how far Apple is prepared to go. As of today it appears the FBI can’t get in, but nobody knows whether that will still be true in a month, or by the time the statute of limitations for that tomato you threw at a neo-Nazi runs out.
What the FBI is demanding is that Apple provide a “crack” or backdoor that could be installed after the fact on a phone seized while locked from a previously unknown target. For political reasons they are using the San Bernardino (ISIS/Daesh) shooter’s phone as the test case. It is known that they wanted laws restricting encryption for at least a year prior to this incident, and that they actually want Apple to help them unlock at least ten more phones. This is not about Daesh, and not about Apple either: it’s about the FBI wanting to be able to unlock YOUR phone and YOUR data any time they want.
Now for the technical part of this: the FBI cannot “brute force” the encryption on recent iPhones because entering ten wrong passphrases triggers automatic destruction of all data. This is a clever security measure, but apparently it is only a band-aid. The FBI claims that without it they could quickly try all possible keys and thus guarantee they could get in. If and only if this is true, it means the iPhone relies on weak encryption “protected” by closed, proprietary software that wipes the data and presumably prevents anyone from simply making thousands of “cloned” copies of the locked, encrypted data for unlimited attempts.
Normal forensic practice against computers is to image (make a bit-for-bit copy of) the hard drive, verify the copy is good, then never touch the original device again as a precaution against accidentally damaging it. All subsequent work is done on copies. The news reports about the FBI’s captured iPhones indicate something is preventing them from doing this. That something must be the iPhone’s closed, proprietary firmware; such a defense is beyond the purview of encryption software.
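The image-and-verify step described above can be sketched in a few lines. This is a generic illustration using Python’s standard library against an ordinary file, not any specific forensic tool:

```python
import hashlib
import shutil

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large disk images don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def image_and_verify(source, dest):
    """Bit-for-bit copy of `source` to `dest`, verified by hash.
    All later analysis is done on `dest`; `source` is never touched again."""
    original = sha256_of(source)
    shutil.copyfile(source, dest)          # the imaging step
    if sha256_of(dest) != original:
        raise IOError("copy does not match original - imaging failed")
    return original
```

With unlimited verified copies in hand, an investigator can hammer away at each copy in turn; this is exactly what the iPhone’s firmware appears to prevent.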
In the field of cryptography, a cipher is only considered secure if an adversary can know everything about it except the actual key used, and then only if the number of possible keys exceeds the ability of an adversary to try them all. A lock on your door that can be defeated by anyone who knows how it goes together, without needing the key, is no lock at all. A lock that can be opened by trying every key on the burglar’s keyring is no good either. If the news reports are true, Apple’s security is comparable to that lock being on a safe full of secret papers, with a special ignition device burning the safe’s contents if ten wrong keys are tried. The FBI wants Apple to help them defuse the igniter so they can open the lock, which they say has too few possible keys.
If Apple built the iPhone to use all of the possible keys of a modern encryption cipher like AES-256 and not just a few of them, circa-2008 reports said it would take the whole surface of the Earth covered with computers 4,000 years to try them all. Today it would still be centuries, maybe as much as 1,000 years if only CPU-based and not graphics-card-based computers were used. This is only true if the user picked a strong passphrase, not something like “God123456,” which would be among the first dozen or so tried. Any one-word passphrase can be cracked in less than a second by an ordinary desktop computer, so long as the word is found in some dictionary in some language.
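The gap between a full keyspace and a guessable passphrase is easy to see with a back-of-the-envelope calculation. The guess rates and dictionary size below are illustrative assumptions, not measured figures:

```python
import math

SECONDS_PER_YEAR = 365 * 24 * 3600

def years_to_exhaust(keyspace_bits, guesses_per_second):
    """Worst-case time to try every key, in years."""
    return (2 ** keyspace_bits) / guesses_per_second / SECONDS_PER_YEAR

# A full AES-256 keyspace is out of reach at any plausible rate:
print(years_to_exhaust(256, 1e18))   # ~3.7e51 years, even at a quintillion guesses/s

# A single word from a 100,000-word dictionary is a different story:
one_word_bits = math.log2(100_000)   # about 16.6 bits
print(years_to_exhaust(one_word_bits, 1e6))  # ~3e-9 years: a fraction of a second
```

The whole dispute only makes sense if the phone’s effective keyspace is far closer to the second number than the first.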
If the passphrase is the username, “123456,” “admin,” or “God,” it will be among the first few tried and could be cracked in less than a minute by a person simply using the device normally. I myself have found this useful when encountering toilets locked with the common five-button mechanical locks. Codes like “1-2-5” and “1-2-4” seem to work a lot more often than one would expect, and there are only a few hundred possible codes, as no button can be used twice.
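For the curious, the possible codes on such a lock can be enumerated directly. This sketch assumes buttons are pressed one at a time and never reused; real five-button locks often also allow simultaneous presses, which changes the count:

```python
from itertools import permutations

# Every code of 1 to 5 presses on a 5-button lock,
# with buttons pressed one at a time and never reused.
buttons = range(5)
codes = [p for k in range(1, 6) for p in permutations(buttons, k)]
print(len(codes))  # 325 possible codes
```

Either way the space is tiny: a patient person could simply try them all, which is why these locks only deter the casual.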
A multiple-word phrase not found in any book ever published is the minimum necessary to secure an ordinary computer against any kind of determined decryption attempt, not only by law enforcement but by any hacker with access to a gaming computer. Gaming computers running powerful graphics-card-based password-cracking software are known as “first password shooters” and essentially can get into anything the FBI or Secret Service can crack by trying lots of different keys. Once such attackers are in, things like banking and credit card information are at risk of being sold to the highest bidder. A long passphrase not found in any book can stop these systems; a long random character string is the “gold standard” of security but hard to memorize. A passphrase written down or stored on a flash drive is useless against police raids and any other burglars who can seize it along with the device.
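The difference between a dictionary word, a multi-word phrase, and a random string comes down to entropy, which is easy to estimate. The pool sizes below (a 7,776-word Diceware-style list, 95 printable ASCII characters, a 100,000-word dictionary) are illustrative assumptions:

```python
import math

def entropy_bits(pool_size, length):
    """Bits of entropy for `length` independent random choices from a pool."""
    return length * math.log2(pool_size)

print(entropy_bits(100_000, 1))  # one dictionary word: ~16.6 bits, cracked instantly
print(entropy_bits(7776, 6))     # six random words: ~77.5 bits
print(entropy_bits(95, 12))      # 12 random printable chars: ~78.8 bits
```

Note that six truly random words match a 12-character random string almost exactly, while being far easier to memorize; this is why the multi-word phrase is the practical minimum, and the random string the “gold standard.”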
Thus, Apple used the data-destroying backup security system. I do not know if it is possible to use an actually secure passphrase to lock an iPhone; if it is, and the intended target used one, the FBI’s demand would not help them at all. Only an always-on keylogger, a data backup to a cloud server that cannot be turned off, or prior access to a known target to install or activate a passphrase logger could help them then.
Apple now says they intend to modify future iPhones so the firmware cannot be “updated” or modified on a locked machine without the passphrase at all, no matter what any court may demand. This might be insurance against someone like Donald Trump taking office, which could create a nightmare scenario where millions of people in the US depend on encryption to stay out of internment camps or worse.
The problem then becomes this: an iPhone that is secure today becomes insecure tomorrow if the NSA or Secret Service manages to create new hardware for cloning all the encrypted data (ciphertext) stored in that phone to an ordinary computer hard drive. From an unlocked phone bought at an Apple store, they could trivially reverse-engineer the encryption routine to determine which keys are possible keys for the otherwise-strong AES encryption supported by the chips used in these phones. Once they knew what small subset of keys the iPhone’s cipher lock could ever generate, and had a way to create the still-locked backup copies, they could try all possible keys no matter what Apple does.
There are only two possible defenses against this. The strongest would be for Apple to make it possible to use a passphrase of 30 or more characters to lock a phone. The FBI and probably even the NSA could do nothing about this, not even if they waterboarded the CEO of Apple in a prison cell. Less strong for users of strong passphrases, but more secure for everyone else, would be to find a way to “salt” the keys generated from a user’s passphrase so that no two phones would have the same list of possible keys. The problem with that approach is that the software/firmware handling the key generation has to be stored in the unencrypted part of the data so it can be used to unlock the rest of the data. Thus it would be vulnerable to after-the-fact forensic analysis of that particular phone. Also, you would have to know that Apple was still selling the secure phones and had not secretly removed part of the security in response to orders from the FBI, the NSA, etc.
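The salting idea can be illustrated with a generic salted key-derivation function. Apple’s actual key-generation scheme is not described here; this sketch uses Python’s standard-library PBKDF2 purely to show how a per-device salt makes the passphrase-to-key mapping unique to each phone. Note that the salt itself must be stored unencrypted, which is exactly the forensic weakness just described:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Stretch a passphrase into a 256-bit key. The per-device salt
    ensures no two devices map the same passphrase to the same key."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt_a = os.urandom(16)   # generated once per device, stored in the clear
salt_b = os.urandom(16)

# Same passphrase on two devices yields two unrelated keys:
print(derive_key("correct horse battery staple", salt_a) !=
      derive_key("correct horse battery staple", salt_b))  # True
```

An attacker who recovers the salt from the unencrypted storage loses nothing of the scheme’s secrecy but gains nothing either: the salt only prevents one precomputed key list from working against every phone.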
As for an iPhone, an Android phone, a laptop, a desktop computer, or any other device that has passed through the custody of an opponent: once this has happened, the device is considered forever unsafe for handling encrypted data. If you are arrested with a smartphone or any other computing device, it must never be used again; it must be destroyed and replaced. The replacement should be bought randomly off the shelf with cash, never ordered in advance. It is trivial to install hardware, firmware, or software keyloggers in a device that is physically in your hands with enough time to finish the job, so throw away your phone every time you are arrested with one! This is even more important now than before, as keyloggers are one of the few ways to capture the passphrase of an encrypted computer, which is what an encrypted iPhone is.