Having provided an overview of the iPad and Chromebook in general, Recurity Labs’ FX now focuses in detail on Apple’s approach to security, integrity protection and crypto.
Let’s talk about Apple in more detail. The iPad security architecture – nothing new, I guess. It’s a standard XNU (Mach+BSD) kernel. There’s only one user – the user ‘mobile’ – that runs all the applications. The root user has a fixed password, we all know that. There’s an additional kernel extension called ‘Seatbelt’ that essentially hardens the operating system a bit more. You have binary code signatures in the actual binaries, so they can make sure they know the binary they’re executing. You have the keychain – central storage for user credentials; and ASLR (Address Space Layout Randomization) and DEP (Data Execution Prevention), whatnot.
The integrity protection – and this is similar for both devices, and something you will see on pretty much any handheld client platform – is that they now have trusted boot. What trusted boot does is: the first stage verifies the second before starting it, the second verifies the third before starting it, and so on. The iPad and the iPhone do the same thing: they have the ROM, the ROM loads the next bootloader (LLB), the LLB loads iBoot, iBoot loads the kernel, and the kernel loads everything else.
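The chain-of-trust idea can be sketched roughly like this – a toy Python model, not Apple’s actual code; the real chain checks RSA signatures over the images rather than bare hashes, and the image contents here are made up:

```python
import hashlib

# Hypothetical boot stages; on a real device these are binaries in flash.
images = {
    "LLB":    b"low-level bootloader code",
    "iBoot":  b"iboot code",
    "kernel": b"xnu kernel code",
}

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Each stage ships with the expected digest of the next stage
# (standing in for a signature check anchored in the ROM).
expected = {name: digest(data) for name, data in images.items()}

def boot_chain() -> bool:
    """ROM verifies LLB, LLB verifies iBoot, iBoot verifies the kernel."""
    for stage in ("LLB", "iBoot", "kernel"):
        if digest(images[stage]) != expected[stage]:
            print(f"verification of {stage} failed - halting boot")
            return False
        print(f"{stage} verified, handing over control")
    return True
```

The point of the construction is that tampering with any one stage breaks verification at exactly that link, as long as the first link (the ROM) is sound.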
What’s kind of bad is if you actually have vulnerabilities in the bootloader, the first stage, then this whole trust chain is kind of broken. So for recovery purposes the bootloader would for example use something that’s called DFU – the Device Firmware Upgrade, I guess, or the ‘Device Fuck U Mode’, which you can turn on and then load the second stage over USB. It really-really sucks if you have a Buffer Overflow in that process, because then everyone can load whatever they want over USB, which of course everyone does for the jailbreaks. That essentially means, if you are losing sight of your iPad even for a very short amount of time, someone can jailbreak your iPad and do whatever they want with it.
The next issue is a case of: if you don’t understand the code, you are breaking it. So a short explanation for those of you who don’t know the details of X.509: the idea is that everyone can sign other stuff, but at some point you don’t want to hand out a CA certificate to someone. When you send a certificate request to VeriSign, saying: “I want this key to be signed for my web server, for SSL”, VeriSign obviously doesn’t want you to be a CA afterwards, so what they give you back is not something you are supposed to use to sign other people’s certificates. This is called ‘basic constraints’: there is a flag in the signed certificate that says “This certificate can be used to sign other stuff” or “It cannot”.
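Here is a toy model of what the basic constraints check is supposed to do – all names are illustrative, and real X.509 validation verifies actual signatures rather than just matching names:

```python
from dataclasses import dataclass

@dataclass
class Cert:
    subject: str
    issuer: str   # subject of the certificate that signed this one
    is_ca: bool   # the basicConstraints CA flag

def verify_chain(chain: list) -> bool:
    """Chain runs from trusted root down to the leaf. Anything that signs
    another certificate must carry CA=true; skipping that test is exactly
    the bug described above."""
    for signer, signee in zip(chain, chain[1:]):
        if signee.issuer != signer.subject:
            return False
        if not signer.is_ca:        # the basic constraints check
            return False
    return True

root   = Cert("VeriSign Root", "VeriSign Root", is_ca=True)
server = Cert("www.example.com", "VeriSign Root", is_ca=False)
# Signed with the web server's own key, which was never meant to be a CA:
forged = Cert("*", "www.example.com", is_ca=False)
```

With the check in place, `verify_chain([root, server, forged])` fails because the server certificate is not a CA; a verifier that forgets the `is_ca` test would happily accept it.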
So Apple actually managed to miss that point. iOS actually ships with a signed certificate where they also ship the private key – it’s on your device. Everyone has an Apple-signed certificate and the private key to it. This is used for push messaging, which actually needs to authenticate, so you have the private key and the certificate on the device. So we looked at it and found out that the bootloader actually doesn’t verify the basic constraints. We could use the certificate that is on the device to sign firmware that then gets booted. However, unfortunately for us, the bootloader is coded so crappily that the certificate chain cannot be longer than three steps, because they didn’t code the check in a loop; they literally wrote “Check this”, “Check that”. So unfortunately, three is too short for the certificate that you have on the device.
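A sketch of what such a hand-unrolled verifier looks like – purely illustrative, not the actual iBoot code, with `signed_by` standing in for a real RSA signature check:

```python
def signed_by(signer: dict, cert: dict) -> bool:
    # Stand-in for an actual cryptographic signature verification.
    return cert["issuer"] == signer["subject"]

def verify_chain_unrolled(chain: list) -> bool:
    """The checks are written out one by one instead of in a loop,
    so a chain with a fourth certificate can never validate."""
    if not chain or len(chain) > 3:
        return False
    if len(chain) >= 2 and not signed_by(chain[0], chain[1]):
        return False
    if len(chain) >= 3 and not signed_by(chain[1], chain[2]):
        return False
    return True
```

If the on-device push certificate already sits three links below the root, then root → intermediate → push certificate → attacker’s firmware certificate is four steps – one too many for the unrolled check, which is why the bootloader attack didn’t work.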
However, we then turned around and said: “Well, how about userland? Did they make the same mistake in userland?” It turns out – yes, of course they did. So essentially, this private key that you have on your device can be used to sign arbitrary certificates and other X.509 content, including HTTPS certificates, email server certificates, VPN stuff, whatever. So SSL was inherently broken on iOS until version 4.3.5.
We told Apple, and we’re like: “Oh my God, now they have to backport this to all the devices they ever made, because there are people who do mission critical shit on the iPhone 3G”. Well, see this: Windows XP is over 10 years old now, and you’re still getting security updates. Apple fucks up SSL completely – for all apps, all browsers, everything on the iPhone and the iPad – and they only ship fixes for the latest version: if you don’t like your SSL to be broken, go buy a fucking new iPhone! That is so ridiculous, but that’s how they do it. So if you still have an older device, you might want to check out the https://issl.recurity.com website, which will show you a nice little lock and tell you that everything is fine, even though we are running that web server with an Apple-signed certificate for the domain ‘*’.
Speaking about doing crypto and doing it wrong: Apple also has a signature scheme on the binaries, and this is all well known. Essentially, they sign the binary with what they call the Mach-O signature. It’s a really ‘smart’ idea to go ahead and build a signature mechanism that signs parts of the binary individually – the code section, the data section, the read-only data section – because they believe there will never be any metadata that is important for execution. Of course there is; there’s a bunch of exploits that are used mainly in jailbreaks. Essentially, what it all comes down to is that you always attack the metadata. The easiest one, the first one, was exactly what they don’t sign – the entry point. The header in the file says: “This is the entry point”, and of course you can change that value and the signature is still correct. So you just slap some stuff on the back of the binary, say: “Here’s the entry point”, and then it jumps there, executes your code, and everything is good again. This is all known – the whole process is pretty much unfixable.
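The entry-point trick can be modelled in a few lines – a toy sketch with made-up structures, where ‘signing’ is just a hash over the sections:

```python
import hashlib

# Toy binary: a header with an entry point, plus the signed sections.
binary = {
    "header":   {"entry_point": 0x1000},    # NOT covered by the signature
    "sections": {"__TEXT": b"\x90" * 64, "__DATA": b"\x00" * 32},
    "trailing": b"",                        # data past the sections, unsigned
}

def section_signature(b: dict) -> str:
    """Only the sections are hashed; header and trailing data are not."""
    h = hashlib.sha256()
    for name in sorted(b["sections"]):
        h.update(name.encode())
        h.update(b["sections"][name])
    return h.hexdigest()

good = section_signature(binary)

# The attack: append a payload after the signed sections and point the
# (unsigned) entry point at it. The signature still checks out.
binary["trailing"] = b"attacker payload"
binary["header"]["entry_point"] = 0x9000
assert section_signature(binary) == good
```

Because the verifier only ever looks at the section contents, nothing it checks has changed – which is why signing sections individually while leaving the metadata unsigned is structurally broken.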
Now, the update story is also something that’s fairly interesting when you talk about doing your client platform. Apple’s update story basically means the AppStore; nowadays they can also do over-the-air updates, but we really haven’t looked at those yet. The AppStore has its own story – we’ll come to it later – but keep in mind that in their review process nobody knows what they are actually reviewing. This also applies to security updates. So if I find a vulnerability in your iPhone or iPad app, and even if I’m a nice guy and tell the publisher, like: “Here, you fucked it up, go ship a fix”, the fix can take up to four weeks, because someone in the AppStore review team was just sitting on it. There is no way to republish an app with a flag saying “This is a security fix, ship it fast”; it’s the same process as if you updated the content, like changing an image or whatever.
What they also do is over-the-air mobile phone carrier updates. Those are pretty interesting; we haven’t looked into the details yet. They’re supposed to be signed somehow, but we’ve seen how ‘well’ signatures work on Apple in general. And they can set things like your proxy – a pretty useful thing to do, especially if you want the credentials the user has.