Friday, April 9, 2010

Why the iPad is Good for Security

A few weeks ago a rather chatty fellow sat next to me at the coffee shop where I was working and asked, "How do you like your Mac?" I replied, and he then said, "I hate Macs." I told him that I think people should use whatever computer operating system they find useful. I mentioned my area of research, computer security, which piqued his interest. He said, "I don't have any antivirus software on my Windows PC and I don't have any viruses." He assumed that he had no viruses because there was no evidence (nothing crashed or disappeared). He went on to report that his computer runs pretty slowly (which I found quite humorous). I told him that he didn't have *a* virus, that he actually had *many* viruses. I explained a little about botnet zombies, to which he replied, "I just want to read email and watch videos." He didn't seem to care in the least that his privacy might be at stake or that his computer might be participating in computer crime.

A research project that I am part of, Poly^2, investigates the idea of increasing security through the use of specialized operating systems. In short, the idea is that we could tailor-make OSes for specific tasks. The idea isn't as simple as merely turning off unused network services (though that is a good idea in general). It goes further: it tries to restrict the primitive functions of the OS (such as memory access) to the bare minimum needed to carry out the specific task. Those who have studied information security may recognize this as the "principle of least privilege". General-purpose OSes defy the principle of least privilege, especially in the context of consumer-grade computers.
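To sketch the idea (a toy illustration in Python, not part of Poly^2 itself): a least-privilege process sheds capabilities it does not need before doing its one task. Here the capability is deliberately coarse, the number of files the process may open, standing in for the far deeper restrictions (memory access, system calls) that Poly^2 targets.

```python
import resource

def drop_to_least_privilege(max_open_files=64):
    """Voluntarily restrict this process to what its one task needs.

    Capping the open-file limit is only a stand-in for the much deeper
    restrictions Poly^2 aims at, but the shape is the same: anything the
    task does not need is made unavailable, so compromised code inherits
    the same tight limits.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if hard != resource.RLIM_INFINITY:
        max_open_files = min(max_open_files, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (max_open_files, hard))

def read_email_and_watch_videos():
    """Stand-in for Joe's workload: it needs only a few file
    descriptors, so the cap never bothers it."""
    with open("/dev/null") as mailbox:
        return mailbox.read()
```

The point of the sketch is that the restriction is one-way: once shed, the capability is gone for the life of the process, even for injected or buggy code running inside it.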

The iPad isn't necessarily a realization of the full Poly^2 ideology, but I think they are related. If Joe Blow just wants to "read email and watch videos," what options does he have? He could buy a standard PC (from here forward, PC refers generally to personal computers; no OS is implied) and patch it every six days. However, the act of patching a computer is distinctly not reading email or watching videos. Should Joe be able to read email and watch videos without additional responsibilities? It seems like a reasonable desire to me. Joe isn't required to patch his car even though it likely uses a microprocessor.

Botnets are a huge problem. Some botnets, like Conficker, control millions of zombie PCs. The zombies are unpatched PCs, many of them likely owned by people like Joe who just want to consume information. If all of those people, who don't require a general-purpose OS, were to buy media consumption devices (MCDs) such as the iPad instead of PCs, then we would likely see a dramatic reduction in botnet zombies.

Most of the criticisms I have seen of the iPad revolve around the assumption that it is a PC. It is not a PC. If you are comparing it to a PC, then yes, you will likely be disappointed. I heard someone say that they didn't like it because it wouldn't run MATLAB. If you want to run MATLAB or Photoshop, you should not buy an iPad. Some have criticized the iPad and iPhone because of their closed nature. I haven't developed for either (I prefer Android myself), so I don't know firsthand what is required. However, as far as I can tell, their APIs are available and they allow you to program in open-standards programming languages. Will the iPad have security vulnerabilities? Of course! However, by carefully controlling the tools applications can be created with and how they can be distributed, Apple can strongly influence and remedy future vulnerabilities.

Is the iPad for me? Probably not; I am not Joe. It may, however, be a good media consumption device for my wife.


Ryan said...

Oh no, here comes the flame! I'm not sure why you mentioned the Macs-don't-have-viruses thing if you willingly acknowledge that it's only because Macs don't have (or possibly even want) the market penetration.

I think your overall assumption is that there will always be Macs for you and MCDs for Joe, and that the iPad represents only an additional specialized device added to the variety in the technical world of gadgets and computers.

Let me be first to say that I do not believe this is Apple's vision. I observe two things:

1. Apple seems to be the only company competent enough to produce computing devices that don't suck.

2. Apple has a totalitarian approach to their iPhone/iPad platform.

If either one of these things weren't true, I wouldn't be worried. But since they both are true, it's easy to see a future where Apple makes a whole lot of money, Joe gets to consume his media, and all those people who want to write applications for individual consumers must bow down and kiss the floor.

Chris Howie said...

"However, as far as I can tell their APIs are available and they allow you to program in open standards programming languages."

Sort of. You can program in the languages *they* want you to. See

This could spell the end of projects like MonoTouch and Adobe's Flash->iPhone compiler.

It's one thing to vet apps for conformity to your UI design guidelines. It's another to say "you cannot use any abstraction of our platform."

So, "open" only in the sense that you can make use of the tools Apple gives you without restriction. Well, actually no. You can't "duplicate functionality" or do a number of other things. Or distribute content Apple has a moral stance against. (Even though Safari presumably has access to similar content.)

Uh huh. And people wonder why I don't like the iPhone... it's not the iPhone, it's the narcissistic control-freak company behind it.

I'm sorry, I appear to have digressed all the way on to a soap box. :)

To make this comment more on-topic, I think this is one of the major reasons why Linux is not targeted as successfully by viruses and worms -- there are just too many distros, too many environments, hell there are even many ways of organizing the filesystem. There's just not a lot to depend on besides "/bin/sh is executable." You could also argue that the kernel itself is more secure, but this is only half the story, since what you run on the OS (like Apache) can introduce more holes. Another argument might be market share; OS X has far less than Windows for sure, so it's not as lucrative a target.

I'm sure all of these play a role. How significant each is I will leave to someone with more knowledge than I.

humanedesign said...

I just wanted you to know I read this :)

D. M. Stanley said...

Thanks for the comments everyone.

@Ryan: I removed the bit about OS X as it didn't add much to the post. Is it your assumption that Apple will do away with PCs long term in favor of MCDs? I don't think that is likely. After all some people must run Photoshop and MatLab.

@Chris: It seems reasonable to me to restrict the allowed languages to those not requiring VMs or interpreters, to save on resources. A 12-hour battery life (per real-world reports) is pretty good. I realize that this will be irritating for people who prefer other languages, but the languages themselves are pretty standard.

Ryan said...

Apple already did not allow interpreted or VM-based code on the iPhone. That is not new, and MonoTouch apps do NOT run on a VM. What Chris was pointing out was a new restriction which outlaws all apps written in languages other than C, C++, and Obj-C and cross-compiled to the iPhone. This goes way beyond efficiency concerns.

Some people think that Apple will evolve away from open Macs toward their iPhone OS based devices since they are clearly becoming the cash cow. I'm not sure about this, since there will always be a need for a developer machine. However, their commitment to openness has been shown to be nil, so whenever it is even remotely in their interests to lock down their devices, they'll go for it. They've shown that they really don't care at all about developers. Their strategy (which has worked great so far) is to build things people want, and developers will flock to Apple practically begging to be whipped.

Chris Howie said...


MonoTouch apps do run in the Mono VM, and are garbage-collected, etc. Since the iPhone OS doesn't allow dynamic compilation, all the methods must be AOT (ahead-of-time) compiled to native code, and this is no different in terms of performance than if the methods were JIT-compiled, except for the reduction in startup time.

Perhaps you are confusing AOT compilation with fully native code.


Battery life is a secondary concern to functionality. Consider the case of games that run on the Unity3D engine, which is in turn Mono-based. People don't expect great battery life when playing 3D games. And yet these games could be removed from the store because they weren't written in the Apple-approved languages.

That's about as closed and malignant as you can get. The fact that these approved languages are standardized is beside the point.

Ryan said...

I'm not really seeing the difference here with MonoTouch. Some kind of technicality? If it's fully compiled to native code, then it's fully compiled to native code. Is it the garbage-collection pauses? This is how they get around the requirement that there be no VM-interpreted (or straight-up interpreted) code running on the iPhone.

Chris Howie said...


Mono has never executed code by interpretation, not since some prehistoric release. If that disqualifies Mono as a VM, then the JVM is equally disqualified, since it has performed JIT for ages.

When it comes to environments like Mono/.NET and Java, the line between virtual machine and native compilation is somewhat blurred, since these environments translate methods from their respective bytecode language to native code at runtime (the first time a method is entered). There is still a lot of governance performed by the runtime, such as GC and stack unwinding from exceptions, as well as some other features like reflection. Whether that is enough to justify the label of "VM" is purely academic, since Mono behaves very much the same way on a desktop platform.

The AOT process that pre-"JIT"s the method bodies is only a technical requirement to circumvent the security policies on the iPhone that prevent the execution of writable memory pages, which is what makes JIT possible. And even with AOT, the IL assembly is still required for class/struct/enum/etc information; the method bodies are simply ignored and taken from the AOT output instead.
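To make the JIT/AOT distinction concrete, here is a toy model (Python standing in for the runtime, strings standing in for IL bytecode; none of this is Mono's real machinery). Both strategies run one and the same translator; they differ only in when it runs.

```python
def translate(bytecode):
    # Toy "compiler": turn a bytecode string like "ADD 2 3" into a
    # callable. Both the JIT and AOT runtimes below use this same
    # routine, so the code they produce is identical.
    op, a, b = bytecode.split()
    a, b = int(a), int(b)
    if op == "ADD":
        return lambda: a + b
    if op == "MUL":
        return lambda: a * b
    raise ValueError("unknown opcode: " + op)

class JitRuntime:
    """Translate each method the first time it is entered, then cache it."""
    def __init__(self, methods):
        self.methods = methods   # name -> bytecode
        self.compiled = {}       # name -> compiled function
        self.compile_count = 0

    def call(self, name):
        if name not in self.compiled:          # first entry: compile now
            self.compiled[name] = translate(self.methods[name])
            self.compile_count += 1
        return self.compiled[name]()

class AotRuntime:
    """Translate every method ahead of time; calls never trigger compilation."""
    def __init__(self, methods):
        self.compiled = {n: translate(bc) for n, bc in methods.items()}

    def call(self, name):
        return self.compiled[name]()
```

After the first call to a method, the JIT runtime and the AOT runtime execute exactly the same compiled function; AOT mainly moves the translation cost to before startup, which is the point being made above.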

Ryan said...

JIT is not allowed on the iPhone. I never said anything about JIT. In fact, the lack of ability to JIT compile combined with the lack of ability to interpret bytecode makes this quite different from what people usually think of when you talk about a language VM. Am I mistaken in assuming that Mono does both of those things on your proverbial desktop? It seems like MonoTouch is a handicapped version of Mono at best.

Would you say, as Dannie does, that MonoTouch apps incur the same natural VM-language resource hogging as though they were standard VM-based apps? This might be true depending on how the native code is generated. If that is the case, then do you suppose Apple's move is motivated by performance?

Ryan said...

There is some interesting commentary on this new restriction from Apple on LtU (PL researchers).

One interesting comment that I saw was the following, by John Stracke:

"Computers just have to be general enough for almost everybody. If locked-down machines like the iPad become the norm (which has got to be Apple's long-term goal), truly general machines will be pushed into smaller and smaller niches. Eventually, if you want to do PL research, you'll need a machine 10 or 100 times as expensive as what you can get at Best Buy."

I think that the "hey, you don't have to buy an iPad" argument ignores the economic effects of potentially game-changing devices like the iPad and the new technology era it heralds. I see the iPad as perfect for the majority of people. How many people really need a general-purpose computing device? If this is indeed the case, then where will we general-purpose-craving geeks find ourselves when our preferred hardware is no longer a commodity? What programming languages and tools are you going to be allowed to use when you get hired at a company and they hand you a special-purpose, single-platform programming device?

Chris Howie said...


"JIT is not allowed on the iPhone."

I've already said this twice.

"In fact, the lack of ability to JIT compile combined with the lack of ability to interpret bytecode makes this quite different from what people usually think of when you talk about a language VM."

Perhaps. This is why the line is a bit blurry.

"Am I mistaken in assuming that Mono does both of those things on your proverbial desktop?"

Yes you are. Mono does not have an interpreted mode, only a JIT mode. And you can AOT assemblies for use on desktops as well -- so is Mono not a VM if you are using an AOT-compiled program on the desktop either?

"It seems like MonoTouch is a handicapped version of Mono at best."

The removal of JIT compilation and runtime IL emission is not a very big handicap.

"Would you say, as Dannie does, that MonoTouch apps incur the same natural VM-language resource hogging as though they were standard VM-based apps?"

Yes. The only difference is that the method bodies are converted from IL to native code ahead of time; the native code produced is exactly what the JIT compiler produces. Startup time is reduced since the methods are precompiled, but once a method has been called (and hence JIT-compiled), calling it again gives identical performance to an AOT-compiled assembly. The compiled native code still contains the same calls into the VM at the same places it otherwise would, GC is still performed, etc.

"If that is the case, then do you suppose Apple's move is motivated by performance?"

For the kinds of apps you usually see on the iPhone, which are not terribly CPU-intensive, Mono apps perform at about the same level since there is just not very much going on under the hood.

Also, considering that nobody has been complaining about the performance of games that use Unity3D (which uses Mono to run all code written by the game developer), no, I don't think performance or battery life has anything to do with it.

Ryan said...

Very interesting. I've always thought that most VMs did bytecode interpretation with some JIT mixed in at the same time.

Could it be that I've been a fool this whole time?

Chris Howie said...


"Very interesting. I've always thought that most VMs did bytecode interpretation with some JIT mixed in at the same time."

A lot of runtimes used to, I believe, back when JIT compilation was less optimized, took longer, and created less efficient native code, so it would only be used in cases where it would actually improve performance. Now that the algorithms are better and faster, there is little reason not to JIT everything.

Note that the optimizations performed often go far beyond simply turning bytecode into native code, and can also include inlining of methods (for example, in the case of simple setters/getters).
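To sketch what inlining buys (a toy illustration, not Mono's actual optimizer): the compiler effectively replaces a call to a trivial getter with the getter's body, a direct field load, which removes the call overhead entirely. That is why simple setters/getters are such profitable targets.

```python
class Point:
    def __init__(self, x):
        self._x = x

    def get_x(self):
        # A trivial getter: a prime candidate for inlining.
        return self._x

def sum_via_getter(points):
    # What the source code says: one call per element.
    return sum(p.get_x() for p in points)

def sum_inlined(points):
    # What an inlining compiler effectively emits: the call is replaced
    # by the getter's body, a direct field load.
    return sum(p._x for p in points)
```

Both functions compute the same result; the second is simply what the first becomes after the optimizer has done its work.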

As a side note, JIT can actually make some things slower, but these usually relate to exceptions. For example, for Mono to throw a NullReferenceException it must catch a SIGSEGV and do some inspection to determine exactly why it received that signal. This is a lot more complex than looking at the stack when call/callvirt/ldfld is processed.

"Could it be that I've been a fool this whole time?"

Or I have for calling something that doesn't do interpretation a virtual machine. Who knows.

Either way, I don't think Apple's motives have anything to do with performance. :)