The Next Round of the FBI vs. Apple Encryption Debate Is Game Over for Security
Greg Leffler
Director of Developer Evangelism at Splunk. Former SRE Leader and Editor at Large at LinkedIn.
Apple and the FBI are currently engaged in a contest over who controls the phone in your pocket, and the outcome could shape individual privacy rights for decades to come. As someone who is passionate about personal privacy (and who donates regularly to the EFF), I was chagrined to read that a court had granted an order requiring Apple to weaken the security of its products to collect information that can't possibly be relevant to putting someone in jail (the people responsible for the crime in question are dead).
For what it's worth, Apple's open letter about this is very cogent and goes a long way toward explaining its position and how complying with this order (regardless of the merits of the order, which I won't argue) could have dire consequences for the security and privacy of every iPhone user worldwide. I'd strongly suggest reading it for background on what's being asked and where Apple stands.
To understand why Apple is taking such a hard line, you have to know a little about how the iPhone stays secure. For the model in question (and older models), most of the security features are controlled by the software running on the device. Replace that software, and you gain more control over those features. On this specific iPhone the security features are well implemented, so there is no software you can install that will instantly unlock the phone.
However, the security protections of any device are really only as strong as the password that protects them. Since knowing the password unlocks the device, and since someone nefarious could simply try to guess it, Apple's software limits how many times you can guess the password, and optionally erases the phone's contents if the wrong password is entered too many times. It is this software the FBI wants Apple to disable for the phone in question.
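The throttling described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general idea (attempt counter, escalating delays, wipe after ten failures); Apple's actual implementation is not public, and the specific delay values here are assumptions:

```python
# Hypothetical sketch of passcode-attempt throttling, loosely modeled on
# the behavior described above. NOT Apple's actual implementation.

MAX_ATTEMPTS = 10  # after this many failures, the device erases itself
# Escalating delays (in seconds) before the next guess is allowed:
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def delay_before_next_attempt(self):
        """Escalating wait imposed after repeated wrong guesses."""
        i = min(self.failed_attempts, len(DELAYS) - 1)
        return DELAYS[i]

    def try_unlock(self, guess):
        if self.wiped:
            return False  # nothing left to unlock
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            # In practice the phone destroys its encryption key,
            # which renders the stored data unreadable.
            self.wiped = True
        return False
```

With these protections in place, brute-forcing even a four-digit passcode takes far too long (and risks triggering the wipe), which is exactly why the FBI wants them disabled.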
As a sidebar, on modern phones (pretty much everything after the iPhone 5C in question), the hardware itself performs many of the checks the FBI wants Apple to disable. Because of the way the password is verified and the data is secured on the phone, it is impossible* for software to disable the automatic erase or the escalating time delay between incorrect password entries.
It should be clear by now that whoever controls the software on the device (for these older devices) controls the security of the device, and Apple is the entity that controls the software on iPhones. You can build your own version of iOS, compile it, and distribute it, but it still won't run on actual devices. This is because of digital signatures. A digital signature relies on some complicated mathematics that amounts to this: factoring large numbers is really hard (what two numbers greater than 1 million can you multiply together to get 151,943,469,098,107,730? Go ahead, use your calculator), but the reverse, verifying that some numbers multiply together to a large number (what's 389,417,839 times 390,181,070?), is easy. (Yes, you could probably factor 151 quadrillion very quickly with Wolfram Alpha or something similar. Try doing it with a number that's 1,200 digits long!)
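You can check the arithmetic above, and see the asymmetry between multiplying and factoring, in a few lines of Python. (Note that the example numbers above are illustrative and not prime; real cryptosystems multiply two enormous primes so that no small factor exists to find.)

```python
# Multiplying the two large numbers from the example is instant,
# even in plain Python:
a, b = 389_417_839, 390_181_070
product = a * b
print(product)  # 151943469098107730

# Going the other way (recovering unknown factors of a big number)
# means searching. Naive trial division shows the shape of the work:
def trial_division(n):
    """Return the prime factorization of n, smallest factors first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

# Fine for small numbers; for the ~1,200-digit products used in real
# cryptography, no known method finishes in any reasonable time.
print(trial_division(2 * 3 * 5 * 7))  # [2, 3, 5, 7]
```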
For the record, this is the same mathematics that underlies encryption, which is what makes it safe to send your credit card number over the Internet. iPhones (and most other devices that use this signing technology) rely on something called "public-key cryptography". The way this works is that there is a public key, which is distributed widely, and a private key known only to the party we want to be able to sign (or decrypt, as appropriate) messages.
What do digital signatures have to do with iPhones? Everything. When you turn your iPhone on, it starts running software that is physically stored in read-only memory on the device. That software then reads iOS (the software that makes your iPhone your iPhone) from flash memory and prepares to run it. Before it actually starts iOS, though, it checks the digital signature and makes sure it was generated by Apple (i.e., it makes sure the signature was made with the private key that matches the public key burned into the device). If the check fails, the software won't run.
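A toy RSA-style signature check makes the boot-time check concrete. These are the classic textbook numbers (p=61, q=53, so n=3233); real keys are thousands of bits long and sign a padded hash of the software, but the asymmetry is the same: only the private-key holder can produce a valid signature, while anyone with the public key can verify one.

```python
# Toy RSA signature scheme with textbook-sized numbers (p=61, q=53).
# Purely illustrative; real signatures use huge keys and hashed,
# padded messages.
n, e = 3233, 17   # public key: burned into every device's boot ROM
d = 2753          # private key: known only to the signer ("Apple")

def sign(message: int) -> int:
    """Only the private-key holder can compute this (message < n)."""
    return pow(message, d, n)

def verify(message: int, signature: int) -> bool:
    """Anyone holding the public key can check the signature."""
    return pow(signature, e, n) == message

genuine = 1234            # stand-in for the hash of a genuine iOS build
sig = sign(genuine)
print(verify(genuine, sig))  # True: the boot ROM accepts this build
print(verify(4321, sig))     # False: a modified build is rejected
```

The boot ROM in the phone plays the role of `verify` here: it holds only the public key, so it can reject tampered software but can never be used to forge a signature.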
So, even if the FBI were to create their own version of iOS that bypassed the security protections and made it easier for them to try to guess the password, they would still need Apple’s help to get it signed so it could run on actual iPhones. It is impossible* to digitally sign things that can be verified with a public key without having the corresponding private key.
To fully understand why I'm telling you all of this, you need one small piece of legal context: the basis the court is using to compel Apple's help in unlocking the phone. It's called the All Writs Act, and what it basically says is "courts can order anyone to do anything". Why is this relevant? Think about what you now know about how iPhones stay secure: they rely on Apple producing secure software, and on the hardware enforcing that only software Apple has signed can execute on the device. The hardware enforces this by checking that the signature was made with the right private key.
So let's assume the FBI loses Apple's inevitable appeal, and Apple doesn't have to develop a version of iOS that makes these older iPhones insecure. Under the same legal theory, the FBI can still compel Apple to hand over the source code for the software on its devices. It can compel Apple to provide the build tools and other information needed to make that software work. It can probably even compel Apple to point out exactly what to change in the software to make it less secure, so that it meets the FBI's needs. Finally, and most worryingly, the FBI can compel Apple to hand over its private key, so that the software the FBI creates will run on an actual iPhone.
Once the FBI has Apple's private key, so does anybody successfully spying on us. So does every other country that can strongarm Apple into doing the same thing, now that there's precedent for it. Every device sold with the corresponding public key burned into it (which can't be changed) is forever insecure: anybody with the key can produce software that masquerades as coming from Apple. The personal privacy and security of your device could never be guaranteed again. It really is "Game Over" from a security perspective.
Think long and hard about whether catching a bad guy we've already caught is worth the risk that nobody, anywhere in the world, can trust the device that carries their life around. The iPhone is one of the most secure computers ever made (assuming you trust Apple), and trying to break it open has real consequences.
* “impossible” given our current computing technology and understanding of mathematics. Nothing is really impossible.
#UnlockiPhone