Let me start this post by asserting that I have no formal affiliation with Apple Computer. I will be discussing facts, but whatever I say about Apple's motives will be by way of speculation.
Now that we have that out of the way...
The Apple iPhone makes extensive use of public key signatures to permit only “signed” and approved applications to run on the phone. In fact, every executable, starting with the firmware bootstrap, must be signed: the hardware ROM checks the signature on the bootstrap, the bootstrap checks the signature on the operating system load, and the operating system checks the signature on each application.
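To make the chain concrete, here is a minimal sketch of the idea in C. It is illustrative only, not Apple's code: the toy checksum stands in for a real public-key verification, and names like `verify_signature` and `load_chain` are my own.

```c
#include <stdio.h>
#include <stdlib.h>

/* One stage in the boot chain: a binary image plus a signature over it. */
typedef struct {
    const char          *name;
    const unsigned char *code;
    size_t               len;
    const unsigned char *sig;
} image_t;

/* Stand-in for a real public-key check (e.g. RSA over a hash of the code).
   Here a signature is "valid" if it equals a toy XOR checksum. */
static int verify_signature(const image_t *img) {
    unsigned char sum = 0;
    for (size_t i = 0; i < img->len; i++)
        sum ^= img->code[i];
    return img->sig[0] == sum;
}

/* Each stage refuses to hand control to the next unless its signature
   verifies: ROM -> bootstrap -> OS -> application, the same check per hop. */
static void load_chain(const image_t *chain, size_t stages) {
    for (size_t i = 0; i < stages; i++) {
        if (!verify_signature(&chain[i])) {
            fprintf(stderr, "%s: bad signature, refusing to run it\n",
                    chain[i].name);
            exit(1);
        }
        printf("%s: signature OK, transferring control\n", chain[i].name);
    }
}

int main(void) {
    const unsigned char boot[] = { 0x10, 0x20 }, os[] = { 0x33 };
    const unsigned char boot_sig[] = { 0x30 }, os_sig[] = { 0x33 };
    const image_t chain[] = {
        { "bootstrap",        boot, sizeof boot, boot_sig },
        { "operating system", os,   sizeof os,   os_sig   },
    };
    load_chain(chain, 2);
    return 0;
}
```

The important property is that the trust anchor (the ROM) is immutable: flip one byte of any later stage and its signature no longer verifies, so the chain stops there.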
This is not a new idea. I remember discussions with people from the Digital Equipment Corporation (DEC) about their notion of having boot ROMs check the signature of the operating system before loading it. They didn't take it to the level of signing applications, but they wanted to ensure that only a “trusted” version of the operating system could be loaded. This trusted version could then be trusted to make assertions about who was logged in. So the motivation was slightly different, but the idea of using digital signatures to ensure that only a known and trusted version of code was running was there.
What the iPhone does is quite interesting from a security perspective. If you only run signed code (be it the operating system or applications) you have a very strong defense against various forms of malware such as trojans, worms, and viruses. They cannot run because they are not signed. Similarly, if an application is somehow corrupted (either by accident or by malicious activity), its digital signature becomes invalid and the iPhone OS will not let it run.
The iPhone also offers some other interesting security features. One of the important security-related Application Programming Interfaces (APIs) on the phone is the “keychain” API, which permits applications to store sensitive information securely.
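For example, storing a password looks something like the following minimal sketch against the Keychain Services C API (SecItemAdd). Error handling and the access-control attributes a real application would set are omitted:

```c
#include <CoreFoundation/CoreFoundation.h>
#include <Security/Security.h>

/* Store a secret as a generic-password keychain item. The OS keeps the
   item encrypted at rest and mediates which application may read it. */
static OSStatus store_secret(CFStringRef service, CFStringRef account,
                             const UInt8 *secret, CFIndex secret_len)
{
    CFDataRef data = CFDataCreate(kCFAllocatorDefault, secret, secret_len);

    const void *keys[] = { kSecClass, kSecAttrService,
                           kSecAttrAccount, kSecValueData };
    const void *vals[] = { kSecClassGenericPassword, service,
                           account, data };
    CFDictionaryRef item = CFDictionaryCreate(
        kCFAllocatorDefault, keys, vals, 4,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    OSStatus status = SecItemAdd(item, NULL);  /* errSecSuccess on success */

    CFRelease(item);
    CFRelease(data);
    return status;
}
```

The point is that the application never manages the encryption itself; it hands the secret to the OS and gets it back through the same mediated interface.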
What this means is that the iPhone has the real potential to offer not just a trusted operating system, but a trusted platform. A platform that can handle sensitive information, be free from malware, and that you can carry in your pocket!
But then there is jailbreaking.
Jailbreaking is about subverting all of this careful digital signature checking so that any application can run on the phone, including those without Apple's sanction.
But if the iPhone is so secure, how is this possible? The simple answer is that all of this signature checking is very powerful, but the implementation is fragile. In the case of the iPhone there is a buffer over-run in the low-level bootstrap ROM (fixed in the 3GS, but I'll leave that for another post). This over-run permits a carefully constructed binary to cause the ROM signature check code to always say “yes” regardless of what it is given. Using this hole, we can load a version of the iPhone OS that has been patched to cause its signature verification code to also always say “yes” and the game is afoot!
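The actual ROM bug deserves its own post, but the general shape of this kind of flaw is easy to show. The sketch below is illustrative only, not the iPhone's code: a copy with an unchecked, attacker-controlled length runs past a fixed buffer and clobbers the variable holding the verification result.

```c
#include <stdio.h>
#include <string.h>

struct loader_state {
    char header[64];  /* fixed-size scratch buffer                  */
    int  sig_ok;      /* result of the signature check lives nearby */
};

static int load_image(struct loader_state *st,
                      const char *data, size_t data_len, int verified)
{
    st->sig_ok = verified;
    /* BUG: no bounds check on data_len. A payload longer than 64 bytes
       runs off the end of header[] and overwrites sig_ok, so a carefully
       constructed image makes the check "say yes" no matter what. */
    memcpy(st->header, data, data_len);
    return st->sig_ok;
}

int main(void) {
    struct loader_state st;
    char evil[65];
    memset(evil, 'A', 64);
    evil[64] = 1;  /* lands on sig_ok (assuming this layout, little-endian) */
    printf("signature check says: %d\n",
           load_image(&st, evil, sizeof evil, 0));
    return 0;
}
```

One unchecked length in code that cannot be patched (it is in ROM) is all it takes to unwind the whole chain of trust built on top of it.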
Fortunately, jailbreaking requires physical access to the phone. It cannot be done remotely (that we know of). More importantly, it cannot be done over the air to the phone of someone who isn't interested in having their phone compromised. However, it means that we cannot assume the iPhone is a trusted platform for the storage of sensitive information, because if someone's phone falls into the wrong hands, whoever has it can jailbreak it and then extract any information they choose.
So I speculate that THIS is the reason why Apple keeps attempting to make jailbreaking harder (or impossible, if they get their wish). Not because they don't want people jailbreaking their own phones, but to protect the phones of people who lose them. So that the iPhone can become a trusted platform.
As long as a stolen phone can be jailbroken, the iPhone cannot evolve into a truly trusted platform... and that is the rub.
This has some interesting implications for the iPad... but that will be a future post.
Comments:
From: Mike H.
Like. I've always been interested in seeing where the fundamental incompatibility between the GPL's copyleft scheme and the iPhone Developer Program License Agreement ends up leading, as they're completely dissonant. While I love my iPhone, I'm always quick to voice that it's troubling that any freedom users should have to modify and share their software is secondary to the legal and paramount requirement of observing and protecting Apple's DRM system... backed by the chilling and stifling DMCA.
From: Jeff Schiller
Here I am commenting on my own post!
Although a non-jailbroken iPhone has a powerful security barrier in the requirement that applications be digitally signed, it should be mentioned that the signature is only checked when a program is loaded. If a running program, say Mobile Safari, is compromised via a buffer over-run, the attacker can gain access to any information that the userid Mobile Safari runs as (on the iPhone, "mobile") has access to.
This is actually an argument for sandboxing complex applications like Safari, which are likely to have security problems in their implementation. If Safari were sandboxed so that it could only access the files on the iPhone that it needs, then exploits against Safari (and WebKit and friends) would be a lot less effective.
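At its simplest, a sandbox is just a policy check in front of every resource access. Here is a toy sketch of the idea (the iPhone's real sandbox enforces policy in the kernel, where a compromised process can't skip the check; the paths below are illustrative):

```c
#include <errno.h>
#include <fcntl.h>
#include <string.h>

/* Paths this application is allowed to touch; everything else is denied. */
static const char *allowed_prefixes[] = {
    "/var/mobile/Library/Safari/",  /* illustrative profile entries */
    "/tmp/",
};

int sandboxed_open(const char *path, int flags)
{
    for (size_t i = 0;
         i < sizeof allowed_prefixes / sizeof allowed_prefixes[0]; i++) {
        if (strncmp(path, allowed_prefixes[i],
                    strlen(allowed_prefixes[i])) == 0)
            return open(path, flags);  /* inside the profile: allow */
    }
    errno = EACCES;                    /* outside the profile: deny */
    return -1;
}
```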
From: Phil Earnhardt
Jeff: sandboxing Safari would seemingly do little to address the real risk of a compromised browser: leaking accounts/passwords. A compromised browser would see these credentials in plaintext and could easily open a connection to another website to transmit that information.
I'm starting to believe the only means of protecting credentials is a physically separate challenge/response box where there is no means to update the code in the box after it has been manufactured.
I am impressed with the firewalling that's in place between the data stores of applications. Just like the digital signatures, that will have a damping effect on any sort of malware. At the same time, it's very useful to allow some exchange of data between applications -- forcing users to bounce files off of the cloud to exchange them between apps is rather silly. Slowly relaxing very strict security measures is far easier than attempting to add security to an overly promiscuous system.
I'd like to see you address: what is the value in having a portable device whose integrity is constantly verifiable? Is such a device possible? Biology seems to have never come close to that kind of standard.
Copyright © 2009-2023 Jeffrey I. Schiller