The Flawed History of Graphical User Interfaces

Why innovation in computing has been stymied for decades
Computer operating systems as we know them should have never developed the way they did. A return to innovation would improve user interfaces across the board.

This is my current desktop wallpaper:

For those who don’t recognize it, that’s the Xerox Alto, the first personal computer from the 1970s and, arguably, the source of the “desktop” metaphor for computer interfaces.

I no longer believe, as strongly as I did when I first created the wallpaper, that the Alto should never have happened. If I made it again today, the computer pictured would be a first-generation Macintosh, not an Alto. After all, the mistakes that ruined graphical user interfaces (GUIs) forever didn’t happen at Xerox’s Palo Alto Research Center, aka PARC. They were dictated by Steve Jobs to a team that already knew better than to think any of it was okay: the rejection of composition, the paternalism toward users, and the emphasis on fragile, inflexible, demo-tier applications that look impressive in marketing material but aren’t usable outside extremely narrow and artificial situations.

In one sense, the computer revolution is over because the period of exponential growth behind the tech ended 10 years ago. In another sense, it hasn’t yet begun. We’ve sheltered ourselves from the social and intellectual ramifications of computing. Documents are still simulations of paper and capitalism still exists. It’s like that period when printing presses existed but everybody used a faux-calligraphic font.

The WIMP (windows, icons, menus, pointer) paradigm is not the only way to create a user interface. In fact, the Unix command line is itself a UI, with its own idioms—distinct from other text-based UIs (such as the AS/400, or ColorForth, or any programming language with a REPL)—and can usefully inform other kinds of UIs, as we see with notebook interfaces, the appearance of autosuggest and autocomplete on phones, and the integration of history in chat applications.
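To make that composition idiom concrete, here is a minimal sketch in Python rather than shell (the function names are illustrative, not any real tool’s API): two single-purpose filters chained together the way a shell user chains small programs with pipes, which is the kind of composition the application model criticized in this essay discards.

```python
from collections import Counter
from typing import Iterable, Iterator


def grep(lines: Iterable[str], needle: str) -> Iterator[str]:
    """Keep only the lines that contain `needle` (a toy stand-in for grep)."""
    return (line for line in lines if needle in line)


def count_unique(lines: Iterable[str]) -> Counter:
    """Tally how many times each distinct line appears (roughly `sort | uniq -c`)."""
    return Counter(lines)


requests = [
    "GET /index.html",
    "GET /wallpaper.png",
    "POST /login",
    "GET /index.html",
]

# Compose the two filters the way a shell user composes programs with a pipe:
# the output of one step is simply the input of the next.
print(count_unique(grep(requests, "GET")))
# Counter({'GET /index.html': 2, 'GET /wallpaper.png': 1})
```

Each piece knows nothing about the others and can be reused in any combination, which is precisely what a sealed, single-purpose application forbids.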

What should never have happened wasn’t the Alto itself, but the widespread adoption of a shallow, dysfunctional misrepresentation of PARC’s ideas behind the Alto. Because, of course, our ideas about what a GUI looks like (the standard WIMP policies of totally independent applications written by strangers, manifesting as a set of overlapping windows that the user interacts with primarily by pointing at and clicking large icons) come out of an attempt to duplicate the look and feel of a short Alto demo (seen by a non-programmer with no technical skills) on an already outdated machine, in assembly language, during a time and budget crunch.

The unifying ideas of modern GUI design crystallized in the Macintosh project and came from Steve Jobs’ “correction” of the behavior of the developers who were doing the work.

This is the standard set of beliefs of the post-1983 Apple school of UI design, phrased slightly less charitably, but more accurately, than its adherents would phrase it:

  1. The end user is stupid and doesn’t know what they want or need, so the responsibility falls to the UI designer to create a system that cannot be overridden—because the user would just screw it up.
  2. The end user is stupid and can’t be trusted to learn how to perform tasks. It’s up to the designer to create turnkey systems, called applications, that make decisions for the user and are useful for only one particular way of performing one particular task. The applications cannot be combined or used in tandem, because the user wouldn’t be able to conceptualize the idea of two things working together.
  3. The end user is stupid and can’t be trusted to take care of their own possessions, so the machine has no expansion ports and contains no user-serviceable parts inside.

Nobody on the team actually believed these things, as far as I can tell.

The actual reasons for the decisions were time and budget pressures: The Lisa, Apple’s first shot at a clone of the Alto, had been released. It was underpowered, overpriced, and failing at market. Jobs left the team working on the Lisa and shifted his attention to a new project at Apple that he recognized as more marketable: the Macintosh.

Jef Raskin had started the Macintosh project a couple of years earlier, but he left Apple over disagreements with Jobs, who took the project in a different direction. (Raskin’s Macintosh was eventually released as the Canon Cat and is interesting in its own right.) The Macintosh under Jobs basically became a budget Lisa, with Jobs taking all of Raskin’s common-sense advice about the Lisa to extremes while pretending the ideas were his own. The end result looked vaguely like the Alto’s interface to someone who had never used one or heard a detailed description of how it worked. But it contained none of the Alto’s interesting ideas.

Of course, after Jobs was forced out in 1985, Apple started drinking his Kool-Aid, and after 1987, the whole industry was drinking it as well. When Apple became cool again in 1997 (after 12 years of the Macintosh brand bombing and Apple narrowly avoiding bankruptcy by leaning heavily into deep-discount deals with public schools), people started embracing this Jobs ethos more explicitly, coinciding with Jobs’ return to Apple. In the wake of the dot-com crash, a popular book called Don’t Make Me Think came out, articulating Jobs’ paternalistic attitude toward users for web developers and other folks who had lived through the years when the Macintosh was a pariah.

This fixation on the myth of the Macintosh, even when it comes down to clearing up misconceptions, is part of the problem. The Macintosh looms large because it was so influential, but everything good about GUIs came through the Macintosh from the Alto, while almost everything bad about GUIs came from mistakes made by the Macintosh team—later spun by marketing materials as intentional. The Macintosh shouldn’t overshadow actually interesting GUI experiments that aren’t Alto-derived (like Sketchpad, NLS, and Swyft) or the actual heirs of the Alto legacy (Squeak) or, for that matter, interesting tangents like NeWS.

The most useful thing we can do, when trying to imagine a desirable future for computing, is to try to forget the entire Macintosh legacy and send our minds back to 1976—because that’s when all the innovation and meaningful design stalled out in favor of a paternalistic approach that has been perpetuated ever since.

Nevertheless, that legacy is important as a kind of memento mori: Remember that everything you consider solid—all this path-dependence—is a side effect of decisions made by people who aren’t substantially smarter than you are. And those decisions didn’t occur that long ago. “Worse is better” might have won, but it doesn’t need to keep winning. We can stop it. All we need is luck and courage—and if enough of us have courage, we won’t need the luck.