As this is being written, a judge is considering competing briefs from Apple and the government on the question of whether Apple should have to comply with the FBI’s request to help it brute-force the passcode of an iPhone that had belonged to one of the now-dead terrorists who carried out the San Bernardino shootings. (Update, May 10: the FBI withdrew its request a while ago, saying that it had found another way into the phone. But nobody doubts that they’ll be back at some point with another phone; so this article still represents what could easily be their next move. I’ve also edited this article for clarity, and to make some points more explicit.)

At first, considering the software changes the FBI demanded, I was sympathetic: the core of the demand is to remove a couple of features from the code. These features are (to quote from Apple’s brief) that the code “imposes escalating time delays after the entry of each invalid passcode”, and “includes a setting that—if activated—automatically deletes encrypted data after ten consecutive incorrect attempts to enter the passcode”. Commenting out the lines of code that do these things is, for someone who knows the code, a five-minute task. Just the administrative overhead of sending the resulting code through Apple’s safeguarded signing process would be more work than that. (I don’t know the details of how that is done, and those who do know shouldn’t say; but such master signing keys are corporate crown jewels, and it is not to be expected that handling them is quick or easy.) Even that, though, doesn’t seem like an inordinate burden, in and of itself.

But then the FBI added more requirements which make the job considerably harder, to the point where Apple’s estimate of the work involved (two to four weeks by a team of six to ten people) seems reasonable. One demand is that Apple give them an electronic interface for entering passcodes, so that an FBI technician doesn’t have to sit there tapping in thousands of passcodes by hand. Another is that this new version of the software must be capable of running from RAM rather than being installed on the device, so as not to destroy any data.

What nobody seems to realize is that the FBI already has an electronic interface for entering passcodes; it’s staring them right in the face. It’s called a “capacitive touchscreen”. Such touchscreens work electrically; they send out signals (voltage) through the air to sense the capacitance of the fingers above them. Many eco-nuts would no doubt be horrified to know that Apple’s devices are sending electricity through their fingers – indeed, through their whole body – whenever they get their fingers near the screen; but it’s true. And it isn’t hard to make a circuit that connects a capacitance to a point on the screen and varies that capacitance under computer control. Nor is it hard to expand that circuit to several points on the screen, as would be required for entering a PIN code. (As an alternative to just varying a capacitance, the circuit might sense the voltage waveform that the screen puts out and apply a deliberately spoofed response to mimic a finger; but that is a bit more complicated, and I doubt it would be necessary.) Nor is it difficult to write a piece of software that watches the screen through a video camera and tells the difference between an unsuccessful passcode entry and a successful one. Combine those two tools, and you get an automatic PIN brute-forcing machine.
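To make that concrete, here is a minimal sketch in Python of what the control loop for such a rig might look like. The helpers press_digit(), screen_state(), and wait_for_retry_prompt() are hypothetical stand-ins for the capacitance-spoofing circuit and the camera software; they don’t correspond to any real product or API, and building them is where the actual work lies.

```python
# Sketch only: the helpers below are hypothetical stand-ins for the hardware
# and computer-vision pieces described above; they don't exist as real APIs.
import itertools
import time
from typing import Optional

def press_digit(digit: str) -> None:
    """Drive the capacitance-spoofing circuit over the pad for one key."""
    raise NotImplementedError("hardware-specific")

def screen_state() -> str:
    """Classify a camera frame as 'keypad', 'wrong-pin', or 'unlocked'."""
    raise NotImplementedError("computer-vision-specific")

def wait_for_retry_prompt() -> None:
    """Block until the passcode keypad is shown again."""
    while screen_state() != "keypad":
        time.sleep(0.5)

def try_pin(pin: str) -> bool:
    for digit in pin:
        press_digit(digit)      # spoof a finger tap on that key
        time.sleep(0.1)         # give the touch controller time to register it
    return screen_state() == "unlocked"

def brute_force(length: int = 4) -> Optional[str]:
    for candidate in itertools.product("0123456789", repeat=length):
        pin = "".join(candidate)
        if try_pin(pin):
            return pin
        wait_for_retry_prompt()
    return None
```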

Indeed, for previous versions of iOS such a device exists, by the name of the IP-Box, though it seems to somehow enter the PIN via USB. (I am not sure how; a bit of searching makes it seem like this is not a normal feature of the iPhone.) It also has the trick of cutting power to the phone after a PIN entry fails but before the iPhone writes the record of that failure to flash memory, so that thousands of PINs can be tried. This requires that the phone be disassembled so that the battery connection can be cut and rewired to go through the IP-Box. The trick no longer works with recent versions of iOS, which Apple has changed so that the record of failure is written to flash memory before being reported to the user.
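The logic of that power-cut trick is simple enough to sketch in the same hedged style. All the helpers below are hypothetical (power_on() and power_off() would drive a relay spliced into the rewired battery line); the only point being illustrated is the ordering: cut power the moment a failure is detected, before the phone gets a chance to record it.

```python
# Sketch of the race the IP-Box reportedly exploits on older iOS versions:
# the failed attempt is shown to the user before it is committed to flash,
# so cutting power immediately after detecting the failure erases it.
import time

def power_on() -> None:
    """Hypothetical: close a relay spliced into the rewired battery line."""
    raise NotImplementedError

def power_off() -> None:
    """Hypothetical: open that relay, cutting power instantly."""
    raise NotImplementedError

def enter_pin(pin: str) -> None:
    """Hypothetical: inject the PIN (via USB, in the IP-Box's case)."""
    raise NotImplementedError

def pin_accepted() -> bool:
    """Hypothetical: detect success vs. failure, e.g. from the screen."""
    raise NotImplementedError

def attempt_with_power_cut(pin: str) -> bool:
    """Try one PIN; on failure, cut power before the attempt is recorded."""
    power_on()
    time.sleep(5)               # illustrative: wait for the phone to reach the lock screen
    enter_pin(pin)
    if pin_accepted():
        return True
    power_off()                 # immediately, before the failure counter reaches flash
    return False
```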

So here’s what the FBI could do. First, build the above-described automatic PIN brute forcing machine. (Don’t look at me that way, FBI; yes, I know your specialty is law enforcement, not building electronics; but you at least should have enough of a technical clue to know whom to hire. Though it would help if the community of people who can build this sort of thing would acknowledge that law enforcement has legitimate needs, rather than responding in a tribal fashion. The world really does need people whose job description is “the bad thing that happens to bad people”. But these days they can’t be cavemen; they need to understand something of computing and electronics.)

The second step would be to hack Apple’s code via binary patching, to remove the two features that prevent brute-forcing passcodes. Probably they would just have to overwrite two machine instructions with NOPs. The only hard part would be finding those two instructions, and even that probably wouldn’t be too hard for a good reverse engineer (though I’m guessing here; the difficulty of such tasks varies quite a lot, depending on things like the availability of a symbol table, and I’m very far from being an iOS expert). Having done that, they could go to Apple with a much simpler request: we have this hacked code; sign it for us so that it will run on the terrorist’s phone.
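The mechanics of such a patch are trivial once the offsets are known; here is a hedged sketch in Python. The offsets are placeholders (finding them is the real reverse-engineering work), the NOP encoding shown is the 64-bit ARM one (0xD503201F), a 32-bit build would need a different encoding, and the patched image would of course still be useless without Apple’s signature.

```python
# Sketch of the NOP-overwrite idea. PATCH_OFFSETS are placeholders for the
# file offsets of the delay-escalation and ten-strikes-wipe instructions;
# the value below is the AArch64 NOP (0xD503201F) as little-endian bytes.
NOP_A64 = bytes.fromhex("1f2003d5")

PATCH_OFFSETS = [0x00000000, 0x00000000]   # placeholders, not real offsets

def nop_patch(path_in: str, path_out: str, offsets=PATCH_OFFSETS) -> None:
    with open(path_in, "rb") as f:
        image = bytearray(f.read())
    for off in offsets:
        image[off:off + len(NOP_A64)] = NOP_A64   # overwrite one instruction
    with open(path_out, "wb") as f:
        f.write(image)
```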

That would reduce the debate to its essence. Apple would no longer be able to argue that they shouldn’t be forced to create code that was too dangerous to exist, because they wouldn’t be creating it; the FBI would already have created it. Apple would just be signing it with their key. This is why Apple is a fair target for government demands: not because of their license agreement (an argument the FBI made that Apple’s lawyers easily brushed aside), nor because they’re the only ones who know how to program the device, but because they have retained control over the device by their control of the signing key for the operating system. Asking them to use it is a digital parallel to a demand to an owner of a storage locker: we have a warrant for this locker, and you have the key, so open it up for us. The parallel is so strong that Apple’s attorneys might well advise them not to even try fighting, but just to comply. And, for that matter, they might decide they could comply in good conscience: developing the sort of electronic interface the FBI is presently asking for, which could enter passcodes wirelessly (or via the charging cable), really does pose risks that using the touchscreen doesn’t. James Comey, the head of the FBI, has stated that he doesn’t want a backdoor into phones, but rather entrance via the “front door”; and if anything is the front door to an iPhone, it’s the touchscreen. So access of this sort, while it might not be what he secretly wants, is exactly what he has asked for.

As for the FBI’s demand that the version of iOS produced for them run from RAM, the basis for the modification could be a version of iOS that already does so; at least, I’m under the impression such versions exist. Even if that weren’t possible, changing just a few bytes of the operating system in flash memory is not really going to alter the evidentiary value of the data there, though there might be problems related to legal requirements for forensic tools.

Now, if Apple signed a version of the code that had just those two changes, it would run on all iPhones of that model. So the FBI might tell Apple that they would be free, instead of signing the FBI-hacked version, to make their own version of the hack which also checks the phone’s serial number and runs only on that one phone. (The FBI has been much criticized for saying that this is a one-time request, when inevitably other requests will follow if this one is successful, but they have a point: this isn’t a warrant like the one served on Ladar Levison, which demanded that he supply his private SSL key, enabling the government to decrypt not just the messages they were originally after but all the messages all of his customers ever sent or received. In this case any further phones to be unlocked would still have to be individually approved by a judge.) (Update, March 23 2018: The leading sentence of this paragraph is wrong: Apple’s code already has this serial number check, so the above paragraph is moot. See marcan’s exposition of this issue.)

All the same, the FBI could let Apple stew in their own juices a bit here. There are great risks in having a common signing key for all their phones, because if that key ever gets disclosed, Apple’s control over hundreds of millions of iPhones is lost. If instead Apple had one signing key per phone, they could, on receipt of this sort of warrant, merely hand over the key for that particular phone and let the government do whatever they wanted with it. The whole drama would be avoided; it would be as routine a thing as court-ordered access to a storage locker. At present, the situation is as delicate as it would be for a nationwide storage locker chain that used a common master key across all its facilities, where any use of that key would mean taking the risk that it might leak out, compromising their security nationwide.
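To illustrate the difference, here is a minimal sketch of the one-key-per-phone idea, using Ed25519 keys from the Python cryptography package. Apple’s real signing infrastructure is of course nothing this simple; the point is only that with per-device keys, complying with a warrant means handing over one key that is useless against every other phone.

```python
# Minimal sketch (not Apple's actual scheme): one signing keypair per phone,
# indexed by serial number. A warrant for one phone can be satisfied by
# handing over that phone's private key alone.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key_vault: dict[str, Ed25519PrivateKey] = {}   # serial number -> that phone's signing key

def provision_device(serial: str) -> bytes:
    """Generate a signing key for one phone; return the public key to burn into it."""
    key = Ed25519PrivateKey.generate()
    key_vault[serial] = key
    return key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw
    )

def sign_for_device(serial: str, os_image: bytes) -> bytes:
    """Sign an OS image that only the named phone will accept."""
    return key_vault[serial].sign(os_image)
```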

In the last couple of decades, cryptographic schemes have been moving away from having a single key for everything and in the direction of having a multitude of keys. Indeed, the possibilities for mischief that a single key opens up are a large part of why everyone with a clue is scared of government-imposed backdoors: such schemes almost inevitably involve a single backdoor key, even if an attempt is made to split that key up for storage. The world has never seen the possibilities that such backdoors would give rise to; fiction offers the only parallel to them. In particular, they evoke the world of The Lord of the Rings, in which the One Ring confers invisibility and vast powers, corrupts its users, and is the centerpiece of various adventures, in which it passes through the hands of all sorts of creatures, from Gandalf to Gollum. The authorities have been remarkably creative in trying to find a replacement word for “backdoor”, calling it “a front door”, “a golden key”, and such; but whatever the word, many knowledgeable people will still think of the intention as being something like forging the One Ring in the fires of Mount Doom:

One Ring to rule them all, One Ring to find them,
One Ring to bring them all, and in the darkness bind them.

(And to those in the national security establishment to whom that sounds pretty cool: beware! Some hobbit may steal it and run off to Russia with it. Tricksy, those hobbitses.)

Apple has called the software they are being asked to create a “backdoor”, though it is not the traditional sort of backdoor, which enables spying on running systems without the user’s knowledge. I do not feel comfortable entirely agreeing that it is a backdoor, nor entirely denying it; but the weakness that makes backdoors scary is shared by any system that puts too much reliance on a single cryptographic key.

But though going to one key per device would solve the problem of how to give law enforcement access to devices under court order, it would not solve all of Apple’s problems. In particular, it would not much lessen the degree to which Apple is a target for espionage; a thumb drive full of keys is not all that much harder to steal than a single key. And if our government were to turn brutal, it could confiscate the one about as easily as the other. To seriously reduce their status as a target, Apple would have to give up power over the iPhone. As things stand, you can boot only an Apple-signed operating system on an iPhone, and unless you “jailbreak” it, that operating system will run only Apple-approved software. This is a mechanism of social control which is just begging for governments to start meddling with for their own purposes. In Android, the same limits are there, but can be turned off by the user (or, at least, that’s the way Google would have it; makers of phones can and do alter Android to lock in users). But the best example of letting people take ownership of their own devices comes from the PC world. PC users can disable Secure Boot, which parallels what can be done in Android; but they can also go beyond that and replace Microsoft’s signature-checking key with their own, so that they themselves are the only ones who can approve changes to the operating system. Almost nobody actually does this, but it’s a welcome safety valve.
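In the spirit of that PC-style escape hatch, here is a conceptual sketch (not Microsoft’s or Apple’s actual mechanism) of what a user-replaceable root of trust amounts to: the boot code verifies the OS image against whatever public key is currently enrolled, and the owner can enroll a key of their own in place of the vendor’s.

```python
# Conceptual sketch of a swappable root of trust: boot is allowed only if the
# OS image verifies against the currently enrolled public key, whoever's it is.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def boot_allowed(enrolled_pubkey: bytes, os_image: bytes, signature: bytes) -> bool:
    """Return True only if the image was signed by the enrolled key's owner."""
    try:
        Ed25519PublicKey.from_public_bytes(enrolled_pubkey).verify(signature, os_image)
        return True
    except InvalidSignature:
        return False

# "Taking ownership" is then just replacing the enrolled key with your own
# public key, after which only images you sign will boot.
```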

If something similar were implemented for the iPhone, again very few users would take advantage of it, but those few would be the ones most concerned about security, such as investment banks, government security agencies, spies, and terrorists. Even with the vast majority of users still depending on Apple for their security, having the highest-value targets opt out would significantly lessen the degree to which Apple’s signing keys were a target for espionage and for government demands. It is really not fair for one corporation to have to bear such a large fraction of the world’s security burdens, nor should Apple try; they should release some of that burden to those of us who are willing to shoulder it ourselves. That way they could actually be, as their brief to the court claims them to be, merely a supplier of products and services, not a controlling master.