The Risk to Apple’s Code-Signing Key

One thing that gets glossed over in the Apple encryption debate is whether Apple can make the requested software in such a way that it can be used only once. The problem is that it’s not a simple yes-or-no answer.

Technologically, Apple can absolutely make software that will work only on that one phone. Steve Gibson has an excellent explanation of this on Security Now:

“If Apple complies with this case, there would be no risk of ‘leaking’ anything ‘dangerous’, at least not any more than there is today of Apple’s private key leaking.”
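The mechanism Gibson describes is worth making concrete. An iOS device will only run firmware signed by Apple, and Apple’s signing process can bind a signature to a single device’s unique hardware identifier (its ECID), so the signed image is useless anywhere else. Here is a minimal sketch of that idea in Python, using the `cryptography` package; the Ed25519 key, the payload layout, and the ECID strings are illustrative stand-ins, not Apple’s actual signing format.

```python
# Minimal sketch of device-bound code signing. The payload format is
# made up for illustration; Apple's real scheme (personalized signing
# tickets that include the device ECID) is more involved.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signing_key = Ed25519PrivateKey.generate()  # stands in for Apple's private key
verify_key = signing_key.public_key()       # baked into every device's boot chain

def sign_update(firmware: bytes, device_ecid: str) -> bytes:
    """Sign the firmware together with one device's unique identifier."""
    payload = device_ecid.encode() + b"|" + firmware
    return signing_key.sign(payload)

def device_accepts(firmware: bytes, signature: bytes, my_ecid: str) -> bool:
    """The device rebuilds the payload with its OWN identifier, so a
    signature produced for any other device fails verification."""
    payload = my_ecid.encode() + b"|" + firmware
    try:
        verify_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False

firmware = b"unlocked-ios-build"
sig = sign_update(firmware, device_ecid="FAROOK-PHONE-ECID")

print(device_accepts(firmware, sig, my_ecid="FAROOK-PHONE-ECID"))  # True
print(device_accepts(firmware, sig, my_ecid="ANY-OTHER-ECID"))     # False
```

Because each device mixes its own identifier into the payload it verifies, a signature produced for one phone fails on every other phone. Leaking the signed image itself leaks nothing useful; the only thing that matters is the signing key, which is exactly Gibson’s point.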

But some experts believe that key is exactly where the problem lies.

Bruce Schneier writes, “They would need to have stolen Apple’s code-signing key so that the phone would recognize the hacked [software] as valid, but governments have done that in the past with other keys and other companies. We simply have no idea who already has this capability.”

A fair point. But maybe he’s wrong. Maybe, and hopefully, Apple has not lost control of its key. The question then becomes: could this case make it harder to protect the key?

The EFF thinks so. “If the government begins routinely demanding new phone-specific cracking software, this could overwhelm the security of this process by requiring many more signatures. This is another valid reason why Apple is right to fight this order.”

The court processes for validating an ‘instrument’ like this also put the code through many more hands, meaning more chances for the key to get out. Jonathan Zdziarski lays out the risks:

To create a forensically sound tool that would hold up in court, the software must be peer reviewed and validated by third parties.

But even then the key can be protected, at least in principle. Let’s assume, optimistically, that even with multiple agencies handling the software, the key remains uncompromised because everyone involved always follows best practices.

Even so, the risk grows as more people handle the code, and more people will handle the code if these kinds of requests become routine.

The best summary of this issue came from Susan Landau in her testimony to Congress.

“The FBI statements that the update will be under Apple’s control and can be tied to work only on Farook’s phone are factually correct. But they miss the point of the risks involved.”

She alludes to the risks that Zdziarski illuminates and also expands on the risk of this becoming a routine process if law enforcement regularly needs to break into encrypted phones.

“All it takes for things to go badly wrong is a bit of neglect in the process or the collaboration of a rogue employee. And if the FBI, CIA, and NSA can suffer from rogue employees, then certainly Apple can as well.”

So there you have it. Technically, the FBI is right: software can be made that works only on this one phone, in this one case, with no danger of harming other phones.

The question, then, is how often you believe this process would be repeated, and how well Apple could protect its key if it were.