When I got my first Mac, it came with the Macintosh Intro, a disk that held a little tutorial explaining how to use various parts of the Macintosh. Among the topics covered was what a mouse is and how to use it (and even that you can lift it off the table and put it down in another spot to have more space to move in a particular direction).
As Steve Jobs recounted in his AllThingsD interview with Walt Mossberg, someone suggested including a touch-typing tutorial in this intro as well, since many people did not know how to use a keyboard. Jobs simply said not to bother, as “death will take care of that”.
When you look at today’s Macs, it appears this has already happened. Not only is there still no keyboard tutorial, the mouse tutorial is gone as well. Heck, you don’t even get a nice little character in an Apple sweater grabbing the menu bar with his hand and pulling out a menu, or zooming and closing a window. It is assumed that everyone today knows what a window and a menu are, and how to use them.
Which isn’t that far from the truth. Children today see other people using a computer and a mouse all day long, be it on the bus, in bank offices and stores, or at home, watching their parents buy plane tickets for the next vacation. Their parents answer their curious questions, and they probably even “play computer” with cardboard boxes. In most high schools, students are taught the basics of computer use, even up to writing Excel formulas. Typing and basic computer usage are necessary, ubiquitous skills today.
One common problem, both on the Macintosh and on the iPhone, is that current generations of users have real trouble understanding the concept of an application. You see it in app store reviews that complain to a third-party developer that an application costs money and should come free with the phone (tell that to Apple!). You see it in the confusion of users who, having closed the last window of an application, find that the menu bar does not belong to the (inactive) frontmost window. You see it in the casual way in which people type their password into any web site that claims to need it. The distinction between a browser or operating system and the actual applications or sites running in it is unclear.
Certainly, some of this confusion stems from the fact that the situation genuinely is confusing. An application with no windows, only a thin menu bar indicating it is still there, is such a subtle clue that application developers should work hard to avoid this situation. The system asks for passwords in so many situations, without a non-geek explanation and without any cause obvious to the user. If Mail.app asks for a new password on any error, even one that was not an authentication failure, just to cover a few edge cases, the user is bound to get used to typing in the password arbitrarily. If the user has no way of distinguishing valid from invalid password requests anymore, the added security is lost, and all that remains is an annoyance. It’s security theatre.
However, some of the confusion may come from the users’ mental model. Every user has one. Most of these models are built alone, simply by observing the behaviour coming out of the machine, without the inside knowledge we computer engineers have. If your mental model of how a computer works was built twenty years ago from the outward behaviour of a completely different system than we have today, it’s no surprise that some of the spots where you filled in the blanks might lead you to the wrong conclusion. I’m not blaming the user. Most of the model is correct and works. How would you know part of it is wrong?
Just like people in the original Mac days thought users would never understand keyboards, I hear people today saying that users will never understand multi-tasking, will never understand what an “application”, an “app”, or a “web site” is, how they differ, and how they are the same.
I don’t see it.
Humans have adapted to changes in the world for millennia. They are flexible enough to understand these concepts. It took about 30 years for keyboards to become well-known enough that the basics of keyboard use no longer had to be explained (even if the “alt” key still mystifies many). People learn to cope with the things they need, and they get used to the things they are confronted with every day.
Today, people still rely on a vendor to supply their applications with the hardware; in the future, getting “apps” themselves will be routine. Just as children’s TV shows today warn kids about expensive call-in shows and shady ringtone subscriptions, tomorrow’s shows will mention apps and purchases. As ruthless as it may sound, the truth of the matter is that, within less than a generation, people unfamiliar with these concepts will have died out. At least in the computerized, so-called “western world”.
You’re kidding, right?
No. Though I’m simplifying. Of course, this is a two-sided development. Users will become more familiar with computer concepts, just as they now have a basic understanding of power outlets and lightbulbs. Similarly, technology will become more approachable. Applications of the future may not have some of the issues that confuse users today, but the general concept will remain, and will still have to be understood.
Just you wait and see.
Update: Finally found a clip with the exact Steve Jobs quote, so linked to it and adjusted the article title from “They’ll die out eventually”.