The Real Story Behind the Apple Privacy Statement

[Editor’s note: Neither the author nor anyone associated with this blog is a lawyer of any kind. This blog is not to be taken as legal advice under any circumstances. If you have a legal question about personal privacy, consult a trained and licensed attorney.]

There’s been a LOT of talk in the news about how Apple is standing up to the federal government (and specifically the FBI), and it’s important to understand why the stance Apple is taking matters. This is not a blanket stand against the government cracking encryption (which would be a worthy stance, but isn’t what is at stake here).

The major issue is that what many people (even some IT Professionals) think is happening is not what is actually happening.

Basically, any iPhone or iPad running iOS 8 or later puts the government in a position where it cannot easily get at the data stored on a phone that has been locked with a passcode of six or more characters and disconnected from iCloud. The reasons for this are complex and highly technical, but the basic idea is that not even Apple can reverse the process on a phone locked this way. Mostly, that’s because the phone’s own internal identification data is combined with the passcode to create a hash – a mathematical representation of the two values that makes up the key to unlock the encryption. Put in your passcode correctly, and the math produces the output the phone is expecting, so the phone unlocks. Put in the wrong passcode, and there’s no match, so the phone stays locked tight. Put in the wrong passcode enough times, and the phone forgets the key entirely, essentially permanently encrypting all the data – with the same impact as erasing it all, as far as the government is concerned.
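To make the idea concrete, here’s a deliberately simplified sketch in Python of how that kind of “entanglement” works. This is not Apple’s actual algorithm – Apple uses a hardware-bound device secret and its own key-derivation scheme – and the device secret, iteration count, and passcode below are all made-up stand-ins for illustration.

```python
# Simplified illustration of "entangling" a passcode with a device-unique
# secret. NOT Apple's actual algorithm; every value here is a stand-in.
import hashlib
import os

DEVICE_UID = os.urandom(32)   # stands in for the secret baked into the phone's hardware

def derive_unlock_key(passcode: str) -> bytes:
    # A standard key-derivation function mixes the passcode with the
    # device-bound secret. The high iteration count makes every single
    # guess deliberately slow, even for a fast computer.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode("utf-8"),
        DEVICE_UID,               # device secret used as the salt here
        100_000,                  # iteration count (illustrative)
    )

# Set once, when the user picks a passcode:
stored_key = derive_unlock_key("h4mster")

def try_unlock(guess: str) -> bool:
    # Correct passcode -> identical derived key -> the phone unlocks.
    # Wrong passcode -> no match -> the phone stays locked.
    return derive_unlock_key(guess) == stored_key
```

Because the derived key depends on a secret that never leaves the phone, even copying the encrypted data off the device doesn’t help: every guess has to be run through the phone itself.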

In this case, a phone that was in the possession of one of the San Bernardino shooters has been locked with a passcode of at least six characters, and was disconnected from iCloud about a month before the shooting. That means the government has 10 tries to get the code, or the phone irreversibly loses the encryption key, rendering all data sitting on the phone pretty much unreadable forever.

Here’s where things get tricky.

Apple is not saying they refuse to unlock the phone for the FBI, or that they refuse to give the government anything Apple has direct access to. That is a common misconception widely repeated by the media, and it is flat-out wrong. Apple *cannot* unlock the phone. It’s not physically or digitally possible for them to do it without changing the codebase used by iOS 9 (which is what’s on the phone). Apple *can* give – and has already given – the government anything stored in iCloud. Apple has done this before whenever there is a valid warrant for that data; it’s stored under Apple’s own encryption, so Apple can decrypt it and hand over the info.

The issue here is that the shooter either broke iCloud backup, or manually turned it off, about a month before the shooting. That means that the majority of the information the government wants is located – and is *only* located – on the phone. Since Apple cannot reverse the locking mechanism of the phone, they do not have access to that information and can’t hand it over to the government even if they wanted to.

What Apple can do – and is refusing to do – is give the government a way to perform what is known as a “brute force” attack against the phone. A brute force attack is literally a person or computer trying combination after combination until they hit the right passcode. Normally, each passcode attempt takes a small amount of time to process, and iOS deliberately adds a delay on top of that as a defense against exactly this kind of attack. To a user this isn’t an issue, as a human entering a code won’t even notice it; but a brute force attack requires thousands of attempts to be processed automatically by a computer, and those small delays add up to a LOT of extra time at that scale. The second – and more pressing – issue is that after 10 failed tries, the phone destroys its encryption key and the data can never be decrypted. Ten tries is nowhere near enough to accomplish a brute force attack, and based on what the government is saying, they’re around try 8 right now with no success.
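Some rough, purely illustrative arithmetic shows why those two limits matter. The per-guess cost below is an assumed number for the sake of the example, not a measured iOS figure.

```python
# Back-of-the-envelope numbers for a brute-force attack. The ~80 ms
# per-guess cost is an assumed figure for illustration, not Apple's.
SECONDS_PER_GUESS = 0.08
SIX_DIGIT_CODES = 10 ** 6            # every possible 6-digit numeric passcode
SIX_CHAR_ALNUM = 36 ** 6             # 6 characters drawn from letters + digits

print(f"6-digit codes, worst case: {SIX_DIGIT_CODES * SECONDS_PER_GUESS / 3600:.0f} hours")
print(f"6-char alphanumeric, worst case: {SIX_CHAR_ALNUM * SECONDS_PER_GUESS / 86400:.0f} days")

# ...and none of that matters anyway, because the wipe limit ends the
# attack long before the key space is covered:
MAX_GUESSES_BEFORE_WIPE = 10
print(f"Guesses actually allowed before the key is destroyed: {MAX_GUESSES_BEFORE_WIPE}")
```

Even with generous assumptions, an alphanumeric passcode pushes the worst case into years, and the 10-guess wipe makes the whole exercise moot unless the limits themselves are removed.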

So what can Apple do? They can provide a signed version of the iOS software that overrides the restrictions in iOS that protect against such a brute force attack. Basically, it would allow an unlimited number of tries and remove the pause between attempts. That would let a government computer run thousands of attempts until it happens upon the right passcode and the phone unlocks.
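In other words, the firmware the FBI is asking for would have to strip out guard logic that, conceptually, looks something like the sketch below. The delay values, wipe threshold, and the `try_unlock()` stand-in are illustrative assumptions, not iOS internals.

```python
# Conceptual sketch of the protections the requested firmware would remove.
# Delay values are illustrative, not the actual iOS lockout schedule.
import time

ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # seconds (assumed)
WIPE_THRESHOLD = 10

failed_attempts = 0

def try_unlock(guess: str) -> bool:
    # Stand-in for the key-derivation check sketched earlier.
    return guess == "h4mster"

def guarded_unlock(guess: str) -> bool:
    global failed_attempts
    if failed_attempts >= WIPE_THRESHOLD:
        # The phone discards the encryption key; the data is gone for good.
        raise RuntimeError("too many failed attempts: key destroyed")
    if try_unlock(guess):
        failed_attempts = 0
        return True
    failed_attempts += 1
    # Escalating forced delays make automated guessing impractically slow.
    time.sleep(ESCALATING_DELAYS.get(failed_attempts, 0))
    return False
```

Delete the counter and the sleep, and a computer plugged into the phone can simply hammer away until it finds the code – which is exactly what Apple is refusing to enable.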

This leads to the question, “If Apple can do this, why don’t they?” The answer is the heart of the matter, and a major issue in the field of personal privacy.

Apple could provide the government with a software update, which could be applied via the Lightning port (just as you can do with official software updates if you don’t want them downloaded directly to the phone). They can create an update that allows the government to do what they’re trying to do. The problem is that doing so unleashes a genie no one wants to see let loose. Putting that kind of software into even the US government’s hands means it is out there. Just as the government could use it to brute-force a phone open when they have a valid warrant, anyone else who got their hands on the code could do the exact same thing with nothing standing in their way. Hackers the world over would quickly be able to break a phone’s security simply by getting physical possession of it for long enough.

Basically, this is like the government asking Medeco or Schlage or another lock maker to provide the means to create a key that will open every single lock that manufacturer ever made, given enough time and tries. While theoretically possible, it wouldn’t be easy to do, and the harm it could do to millions of people would far outweigh the good it could possibly do for this one – albeit truly significant – criminal case. (Hat tip to Henry Martinez for that analogy.)

Apple believes this is a step beyond what they can reasonably be expected to do, and that the government’s requested methodology would leave millions of other iPhone users open to being hacked and having their phone data stolen. Once the code exists, someone will figure out how it was done and start using it to hack people’s devices in short order. The trade-off is simply not balanced enough to justify building the altered iOS software and then handing it to the FBI.

Who will win? That’s up to the courts to decide. At this point both sides have valid legal arguments and plenty of ground to stand on, which means either side could win or lose this one. Don’t be surprised if this goes all the way up to the US Supreme Court, as both sides appear ready to fight it to the bitter end. Personal privacy and protection for everyone not involved in the crime, versus the government’s lawful ability to gather evidence in a criminal case, is not something that will be decided quickly or easily – but it is of vital importance to every one of us. Can the government demand something that could so easily be used for both their good and everyone else’s evil? Can Apple refuse to provide a software solution that is within their ability simply because of its potential for malicious use? Unfortunately, current law has not quite kept up with technology as it speeds ahead of lawmakers.

Either way, Apple is bent on fighting this as hard and as long as they can, and I think that shows a remarkable level of responsibility and care on their part. I expect the government will also fight to the last breath, because the matter is critical to their ability to fight terrorism and other criminal activity. Both sides are right, both sides are wrong, and I feel horrible for the judges who are going to have to figure this one out.
