March 21st, 2010
The iPad is Stupid, and So Are You
With the release of Apple’s iPad right around the corner, social news sites are abuzz with debate about Apple’s latest offering. The feelings among computer geeks are less than positive. I’m not impressed by the iPad, either. I see it as the crippled, unholy union of a netbook and a tablet PC, and since I think both netbooks and tablet PCs are lame, I think the iPad will be all but useless as well. But at least the iPad is more productive than all the geeks complaining about it online.
Think of the Children
Unlike Apple’s Mac offerings, the iPad follows the same application development model as the iPhone and iPod Touch. That is, iPad users must download programs from Apple’s App Store, and developers will have to sign up (which entails a $99 fee) and have their apps authorized by Apple. Because of this, many geeks are heralding the iPad as the end of open computing.
The development model for the iPhone, iPod Touch, and iPad is less than optimal, to put it mildly. I don’t like that a single company can tell me what I can or cannot run on my own machine, nor do I like having to sign up and pay just to write apps. And there’s no denying that Apple enforces some draconian rules that limit competition and stifle innovation in the App Store. That’s why I stick to writing applications for Mac OS X.
Many thirty-something developers are quick to cite the Commodore 64 as the computer that got them into programming. I came along too late to cut my teeth on the C64, but from what I hear it was a computer nerd’s paradise. The C64 booted straight into a BASIC interpreter, and it shipped with a stack of manuals explaining the development environment to beginners. As a result, many kids of the 1980s and early 1990s first experimented with programming on their family’s Commodore. Understandably, many adult programmers fear that the iPad won’t allow similar forays into computer programming.
But times have changed. Anyone who buys a computer today knows that PCs no longer ship with a simple BASIC development environment, nor do they include a stack of manuals on programming. And that’s because computer programming has changed dramatically since the 80s. Modern operating systems like Windows, Mac OS X, and even Linux are far more complicated than their 80s counterparts, and writing even a simple program involves more than typing out a short listing and executing the file (especially if you want your program to run in a window instead of a console); if you’re on Windows, or even Ubuntu Linux, you have to find and install a compiler yourself. (Ironically, Macs do ship with the Xcode software development environment, although it is an optional install.)
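To make the contrast concrete, here’s a sketch of what a “simple program” looks like today (the listing is purely illustrative, not taken from any manual): instead of typing a line or two of BASIC at a prompt and running it on the spot, you save a file, hand it to a compiler you may have had to install yourself, and only then run the result.

    /* hello.c -- an illustrative "simple program" circa 2010.
     * To build and run it (assuming a C compiler such as gcc is installed):
     *   gcc hello.c -o hello
     *   ./hello
     */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello, world!\n");
        return 0;
    }

And that’s the console version; wanting a window drags in a GUI toolkit and an order of magnitude more boilerplate.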
I doubt many 10-year-olds are experimenting with programming today in the same way that their parents did, even without the influence of Apple’s authoritarian products. And yet, geeks act as if Apple is solely responsible for destroying the ability of kids to learn to program on the family PC. Computers and programming are much more complicated than they were in the 80s. The Commodore-style programming environment was gone even before my parents bought our first computer back in 1996—and yet I still managed to get hooked on programming. Go figure.
What About Consoles?
Game consoles follow essentially the same development model as the iPhone and iPad, and yet the combined forces of Sony, Nintendo, and Microsoft have yet to eradicate open computing from the face of the earth. To develop games for video game consoles, developers have to purchase a development kit from the manufacturer, which costs at least $1,700 in the case of the Wii, $99 per year in the case of the Xbox, and $2,000 in the case of the PlayStation (a PS3 dev kit originally cost over $20,000, but Sony has been slashing the prices to attract developers). On top of that, Nintendo will only sell you a Wii development kit if you’re a licensed developer. And the manufacturers are the final arbiters of which games are released for their platforms (Nintendo has a history of being especially authoritarian in this regard). Compared to consoles, developing for App Store-enabled devices is a breeze. But I’ve seen very few iPad critics calling for a boycott of video game consoles, even though their development models are even less free than the iPhone and iPad’s.
Despite the restrictions, consoles (and portable gaming devices like the Nintendo DS) often have rich “homebrew” communities that release kits for developing software independently, along with hackers who strip the consoles’ restrictions (for example, the original Xbox had hacks that let owners copy games to the internal hard drive, whether for performance or for piracy). Of course, none of these hacks are officially sanctioned by the console manufacturers. The latest generation of consoles can detect most modifications and lock the machine out of gaming networks like Xbox Live. Sony did allow Linux to be installed as a guest OS on the PS3, but Linux never got full access to the hardware, and the next iteration of the PS3 will drop Linux support entirely, due to “security concerns”. And yet the PS3’s hypervisor was still hacked. If intrepid hackers can circumvent console restrictions, I’m sure they will do the same for the iPad (as they already have for the iPhone), so hackability doesn’t set the iPad apart from the consoles, either.
But no one complains about consoles. Sometimes the counter-argument is that consoles aren’t really PCs, so the restrictions don’t matter; but that argument overlooks the fact that the iPad isn’t a computer, either.
The iPad is Not a Computer
Apple doesn’t market the iPad as a computer. It’s marketed as a consumer electronics device intended mostly for surfing the web, reading email, looking at photos, and watching YouTube on the go. It’s a portable supplement to a full-fledged computer. And despite the comparisons, the iPad isn’t Apple’s netbook or tablet PC; it’s Apple’s response to them. Apple looked at the netbook and thought, “Why do people use netbooks? How can we make the experience better?” (Whether Apple’s interpretation of the netbook experience really is better is subjective.) Very few people use a netbook as their only computer, and very few people will use the iPad as their only computer, either.
Geeks are afraid of a future in which families buy only an iPad and kids can’t learn how to program, and of a future in which every computer company sells restricted, proprietary devices. But I don’t see that happening, because the iPad simply isn’t positioned as a primary computer, or even as a computer at all. It’s more akin to an expensive toy for adults. In the future, families will still have at least one “real” computer in the house. Depending on how complicated that computer is, kids may or may not be able to learn to program on it; either way, the iPad won’t be the deciding factor.