Would you rather have a chauffeur or a Ferrari?

Dan Bricklin commented in a recent interview on how the expectations of computers from science fiction have not panned out. The point is not that computers are more or less powerful than expected, but that we have wanted to put computers to different uses than expected.

[Photo of a red Ferrari]

Fictional computers such as the HAL 9000 from 2001: A Space Odyssey were envisioned as chauffeurs. You tell the computer what to do and then go along passively for the ride. Bricklin says it looks like people would rather have a Ferrari than a chauffeur. We want our computers to be powerful tools, but we want to be actively involved in using them.

I’d refine that to say we either want to actively use our computers, or we want them to be invisible. Maybe there’s an uncanny valley between these extremes. Most people are blissfully ignorant of the computers embedded in their cars, thermostats, etc. But they don’t want some weird HAL 9000-Clippy hybrid saying “Dave, it looks like you’re updating your résumé. I’ll take care of that for you.”

Update: See Chauffeurs and Ferraris revisited.

10 thoughts on “Would you rather have a chauffeur or a Ferrari?”

  1. John,

    In the interview you were listening to I was making reference to chapter 7 of my new book, Bricklin on Technology, “Tools: My Philosophy about What We Should Be Developing.” The specific essay that the chauffeur quote came from is on my web site as “The ‘Computer As Assistant’ Fallacy”, but I have lots more written in the book, which includes other essays (including one not on my web site yet) and commentary. I agree, as you point out here, there are lots of times when you don’t want to interact much and have the computer carry on without you (and I addressed that a little in the essay). When you do want to have control (like in a thermostat) you want the right interface. I cover that a lot in that chapter of the book.

    I’m glad you found the interview stimulating.


  2. This reminds me of Eric S. Raymond’s The Revenge of the Hackers essay, in which he describes Linux as:

    … a gleaming red Ferrari, door open, keys swinging from the lock and engine gently purring with a promise of power…

    Give me a nice shiny road bike any day!

  3. @Thomas I’m a big fan of Linux, but the Ferrari analogy seems off to me. It’d be better to describe Linux as an eerily glowing spaceship hovering a meter off the ground with no discernible opening, an inscrutable control system labelled in alien hieroglyphs, and a warp-drive reactor emitting a somewhat disconcerting high-pitched trill on the edge of human hearing.

    Sure, sufficiently technically sophisticated folks can pick up Linux fairly easily, but to ordinary computer users such skills might as well be alien linguistics and quantum mechanics.

  4. I’ve long contended that computers should be like refrigerators. You plug them in and don’t think about them. When was the last time your refrigerator was hacked? How many times does your wife call you at work about the refrigerator not printing or email not working? When was the last time your neighbor asked you to come over and fix the refrigerator? Is your ice maker more reliable than your printer driver?

  5. My impression is that people are defining the “Ferrari-ness” of a computer system based on what they actually do. For example: I make my own graphs in R and write my own computer programs (sometimes even in Fortran). Does that mean that I’m using a Ferrari? Maybe, but at the same time I use the standard operating system, word processors, etc.: so maybe I’m using a chauffeur. For just about any tool you use, there will be a balance of what aspects you use automatically and what aspects you customize. This balance will be in different places for different people. I think the idea of Ferrari vs. chauffeur has more to do with people’s self-perceptions (“expert” vs. “helpless”, or maybe “competent” vs. “too important to need to worry about the details”) rather than how people are actually using the system.

  6. I think people are missing the point about my chauffeur vs. Ferrari analogy. I was referring to chapter 7 of my book which includes these essays (among others) that you should read:

    The “Computer as Assistant” Fallacy
    Metaphors, Not Conversations
    Why Johnny can’t program

    When I said “chauffeur” I meant something that does it for you in a non-transparent way, that you have to just trust will do it right and can’t check how it’s doing it. By “Ferrari” I meant something that is responsive and that gives you lots of control when you want it. (You can drive a Ferrari around town to buy groceries, but you can also make very sharp turns, accelerate fast, feel the road, etc.) You don’t trust your refrigerator to know what temperature to keep things at — you need to choose yourself between the main part and the freezer, but it’s obvious how to do that. I didn’t mean simple vs. complex; I meant black-box automatic vs. WYSIWYG with many obvious ways to get fine control that you use yourself.

    As I point out in another essay in the book, When The Long Tail Wags The Dog, what people want is the ability to do anything when they want and not be stuck with what the system does automatically as envisioned by a developer.

    Here’s the quote from the beginning of the “Assistant” essay:

    There has been a lot of talk lately about how computers are too hard to learn to use. There is a longing for devices you can just pick up and use without training. Microsoft’s Kai-fu Lee was quoted in The New York Times as saying, when discussing the more “natural and intelligent” user interfaces he hopes to create, “My dream is that the computer of the future is going to be an assistant to the user.”

    This type of thinking strikes me as strange. We don’t ask for our automobiles to be more natural and intelligent, nor do we call for the next generation of cars to be like chauffeurs. With cars, we talk about responsiveness, comfort, power, cargo size, and safety. Tools are effective and appropriate to the task. Learning to use them is part of being human.

    While a goal of simplicity may be worthwhile for many infrequently used devices that happen to use computing power, I have a real problem with this view of the computer in general, and especially the personal computer. I believe that the computer has a very important role to play in our society, and that that role will require us to continue to deal with its quirks and special needs.

    I hope this makes things a little clearer.

  7. I thought about this one for a while — could it be that we want a magic chauffeur who lets us pretend we’re driving? Often, I’m asked to write a tool that abstracts away everything difficult, but still leaves a few inconsequential tasks for the user; it seems like the user feels threatened by a system that removes the illusion of their relevance to the action.
