all that’s not golden
Several stories and events recently relate in some way to backdoors and golden keys and security. Or do they? In a couple of cases, I think the facts were slightly colored to make for a more exciting narrative. Having decided that golden keys are shitty doesn’t imply that all that’s shit is golden. A few different perspectives here, because I think some of the initial hoopla obscured lessons that even people who don’t like backdoors can learn from.
Microsoft added a feature to Secure Boot, accidentally creating a bypass for older versions. A sweet demo scene release (plain text) compares this incident to the FBI’s requested golden keys. Fortunately, our good friends over at the Register dug into this claim and explained some of the nuance in their article, Bungling Microsoft singlehandedly proves that golden backdoor keys are a terrible idea. Ha, ha, I kid.
Matthew Garrett also has some notes on Microsoft’s compromised Secure Boot implementation. He’s purportedly a Linux developer, but he doesn’t once in this post call Windows a steaming pile, so he’s probably a Microsoft shill in disguise.
Returning to the big question, What does the MS Secure Boot Issue teach us about key escrow? Maybe not a whole lot. Some questions to consider are how thoroughly MS tried to guard the key and whether they actually lost the key or just signed the wrong thing.
Relevant to the crypto backdoor discussion, are the actions taken here the same? In a key escrow scheme, are iPhones sending encrypted data to the FBI or is the FBI sending encrypted messages to iPhones? The direction of information flow probably has a profound effect on the chances of the wrong thing leaking out. Not to say I want anything flowing in either direction, but it does affect how analogous the situations are.
A perhaps more important lesson, for all security or crypto practitioners, is just barely hinted at in mjg59’s post. Microsoft created a new message format, but signed it with a key trusted by systems that did not understand this format. Misinterpretation of data formats results in many vulnerabilities. Whenever it’s possible that a message may be incorrectly handled by existing systems, it’s vital to roll keys to prevent misinterpretation.
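The hazard can be sketched in a few lines. Everything here is hypothetical (the field names, the key, the HMAC stand-in for a real signature), not Microsoft’s actual policy format: a v2 policy carries a device restriction that v1 parsers silently skip, so a policy signed as safe for one test box verifies as a blanket policy everywhere.

```python
import hashlib
import hmac

SIGNING_KEY = b"hypothetical shared signing key"

def sign(policy: bytes) -> bytes:
    # Stand-in for a real signature: an HMAC over the raw policy bytes.
    return hmac.new(SIGNING_KEY, policy, hashlib.sha256).digest()

def v1_allows(policy: bytes, sig: bytes) -> bool:
    # The old verifier trusts the key, so the signature checks out...
    if not hmac.compare_digest(sign(policy), sig):
        return False
    # ...but it only understands "secureboot=", silently ignoring the
    # device restriction that made this policy "safe" to sign.
    fields = dict(f.split(b"=", 1) for f in policy.split(b";"))
    return fields.get(b"secureboot") == b"off"

# Signed with a condition limiting it to one machine; accepted on all of them.
policy = b"device=TESTBOX-ONLY;secureboot=off"
assert v1_allows(policy, sign(policy))
```

Rolling to a new key that v1 systems don’t trust would have made the old parsers reject the new-format policy outright instead of half-understanding it.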
Remember that time moxie defeated SSL by putting null bytes in certs? Very similar flaw. Data must be interpreted identically by the signer and the verifier.
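moxie’s trick reduces to a disagreement over where a string ends. A minimal sketch (the domain names are made up, and real certificates encode names in ASN.1 rather than raw strings): the CA reads a length-prefixed name and sees the whole thing, while a C-based client using null-terminated strings stops at the first NUL.

```python
# A certificate request for a common name with an embedded NUL byte.
common_name = "paypal.com\x00.attacker.example"

def ca_sees(name: str) -> str:
    # Length-aware parsing: the CA sees the whole string.
    return name

def c_client_sees(name: str) -> str:
    # C-style parsing: everything before the first NUL byte.
    return name.split("\x00", 1)[0]

# The CA approves the cert because the attacker controls attacker.example;
# the client then happily matches it against paypal.com.
assert ca_sees(common_name).endswith(".attacker.example")
assert c_client_sees(common_name) == "paypal.com"
```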
(For another trivial example of this flaw, DKIM-signed email typically uses a relaxed canonicalization of the body before hashing, which means it’s possible to rearrange the whitespace and the signature will still verify. For text as text, this may not matter, but it clearly makes a difference for ASCII art. Someone may think they’re being clever sending a QR code as ASCII art, but a malicious third party can rearrange the spacing to create a different QR code with the same signature.)
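A rough sketch of what relaxed body canonicalization permits. This is a simplified approximation of the RFC 6376 rules, not a full DKIM implementation: trailing whitespace is stripped, runs of spaces and tabs collapse to a single space, and two visually different blocks of “art” hash identically.

```python
import hashlib
import re

def relaxed_body(body: str) -> bytes:
    # Simplified relaxed canonicalization: strip trailing whitespace from
    # each line, collapse runs of spaces/tabs to one space, drop trailing
    # empty lines.
    lines = [re.sub(r"[ \t]+", " ", ln.rstrip(" \t")) for ln in body.split("\r\n")]
    while lines and lines[-1] == "":
        lines.pop()
    return ("\r\n".join(lines) + "\r\n").encode()

# Different spacing, different picture, same canonical form -- so a
# signature over one body verifies over the other.
original = "##  ##\r\n  ##\r\n##  ##\r\n"
shifted = "## ##\r\n ##\r\n## ##\r\n"
assert hashlib.sha256(relaxed_body(original)).digest() == \
       hashlib.sha256(relaxed_body(shifted)).digest()
```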
Apple crypto vault
At Blackhat, Apple presented their security infrastructure. Naturally, the first question to ask is, Is Apple’s Cloud Key Vault a crypto backdoor? Note that Prof. Green found an iMessage flaw, so he’s probably an Apple shill in disguise.
How does this relate to the going dark debate? Apple seems to have focused on building a system that nobody can tamper with. That’s rather different from a system that permits the FBI to tamper with it. Like storing secrets in a safe, then welding the door shut. Another rebuttal dissecting exceptional access and false equivalences.
nick left an interesting comment which dissects some of the remaining dangers. I don’t agree with all of it, but it’s informative. However, I think calling something a backdoor should involve assessing two factors: who gets access, and how.
A backdoor, as I would define it, permits access to someone that they would not normally have. So, the first question is who is the Apple backdoor designed to let in? The second question is, how do they get back in? If Apple were trying to create a backdoor, it seems they could have made it a little easier to get back in.
Equation Group leak
While we’re on the topic of jumping to conclusions, attribution based on disassembled hex constants may not be the most accurate technique.
Some more links about government policy and the zero day exploit market.
Rob says the national interest is better served by exploitation than disclosure. This is more of an economic argument than an ethical one. I’ll refer again to Matt Blaze at Hope XI. One idea he floated, which is not to say suggested, is that it may be preferable to have the government continue buying and developing exploits for naturally occurring vulns rather than deliberately introducing backdoors. There’s also an argument that we should wait before building policy.