Let’s pretend this never happened

This is my current desktop wallpaper:

I don’t believe in this as much as I did when I first made it. If I did it today, the computer would be a first-generation Macintosh, not an Alto. The mistakes that ruined GUIs forever — the rejection of composition, the paternalism toward users, the emphasis on fragile, inflexible demo-tier applications that look impressive in marketing material but aren’t usable outside extremely narrow & artificial situations — weren’t invented at PARC but were imposed by fiat by Jobs upon a team that already knew better than to think any of it was OK.

Nevertheless, it’s important as a kind of memento mori: remember that everything you consider solid — all this path-dependence — is a side effect of the decisions of people who aren’t substantially smarter than you are, and those decisions didn’t occur very long ago. “Worse is better” might have won, but it doesn’t need to keep winning. We can stop it. All we need is luck and courage — and if enough of us have courage, we won’t need the luck.

In one sense, ‘the computer revolution is over’ because the period of exponential growth behind the tech ended 10 years ago. In another sense, it hasn’t begun: we have sheltered ourselves from the social and intellectual ramifications of computing. Documents are still simulations of paper, & capitalism still exists. So it’s like that period when printing presses existed but everybody used a faux-calligraphic font.

The WIMP paradigm is not the only way to do a UI. In fact, the unix command line is itself a UI, with its own idioms — distinct from other text-based UIs (such as the AS/400, or ColorForth, or literally any programming language with a REPL) — and a UI that can usefully inform other kinds of UIs (as we see with notebook interfaces, the appearance of autosuggest and autocomplete on phones, and the integration of history in chat applications).
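One of those idioms, composition, is worth making concrete, since it is the thing the application model described further down most conspicuously throws away. Below is a minimal, purely illustrative sketch in Python (the filters, their names, and the “count the error lines” task are all invented for this example, not taken from anything above): three single-purpose pieces in the spirit of `grep | sort | uniq -c`, each ignorant of the others, combined by the person at the keyboard at the moment of need rather than welded together in advance by an application author.

```python
import re
import sys

# Three single-purpose "filters", in the spirit of a shell pipeline
# like `grep error log.txt | sort | uniq -c | sort -rn`.
# Everything here is a hypothetical illustration, not anything from the post.

def lines(stream):
    """Yield lines from any iterable of text: a file, stdin, a list."""
    for line in stream:
        yield line.rstrip("\n")

def matching(pattern, items):
    """Keep only the items that match a regular expression."""
    rx = re.compile(pattern)
    return (item for item in items if rx.search(item))

def counted(items):
    """Tally duplicates and return (item, count) pairs, most frequent first."""
    tally = {}
    for item in items:
        tally[item] = tally.get(item, 0) + 1
    return sorted(tally.items(), key=lambda pair: -pair[1])

if __name__ == "__main__":
    # The combination is decided here, by the user, not baked into any one
    # of the pieces; any other arrangement of the same parts is just as easy.
    for item, count in counted(matching(r"error", lines(sys.stdin))):
        print(f"{count:6d}  {item}")
```

None of the pieces knows it is part of a log-counting tool; the tool exists only in the combination, and that is exactly the property the application model rejects.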

What should ‘never have happened’ is the widespread adoption of a shallow, dysfunctional misrepresentation of PARC’s ideas. Because, of course, our ideas about what a GUI looks like (the standard WIMP policies of totally independent applications written by strangers, manifesting as a set of overlapping windows that are primarily interacted with by pointing at and clicking large icons) come out of an attempt to duplicate the look and feel of a short Alto demo (seen by a non-programmer with no technical skills) on an already-outdated machine, in assembly language, during a time and budget crunch.

The unifying ideas of modern GUI design crystallized in the Macintosh project, and came from Steve Jobs’ “correction” of the behavior of the developers who were doing the work:

1. The end user is stupid & doesn’t know what they want or need; therefore, the responsibility falls upon the UI designer to create a system that cannot be overridden (because the user would just screw it up).

2. The end user is stupid & can’t be trusted to learn how to perform tasks; therefore, it is up to the designer to create turnkey systems called ‘applications’ that make decisions for the user and are useful for only one particular way of performing one particular task. The applications cannot be combined or used in tandem, because the user wouldn’t be able to conceptualize the idea of two things working together anyhow.

3. The end user is stupid & can’t be trusted to take care of their own possessions; therefore, the machine has no expansion ports & contains no user-serviceable parts inside.

This is the standard set of beliefs from the Apple school of UI design post-1983, phrased in a slightly less charitable way than they themselves would phrase it (though not misrepresented in the least). Nobody on the team actually believed these things, as far as I can tell.

The actual reason for the decisions these beliefs were invented to justify was time and budget pressure: the Lisa — Apple’s first shot at a clone of the Alto — had been released, was under-powered and overpriced, and was failing in the market; the Macintosh was Jobs’ revenge on Jef Raskin (who had sent internal memos pointing out that the Lisa team wasn’t accounting for the machine’s limited power when developing software for it). Raskin had started the Macintosh project a couple of years earlier; Jobs pushed him out & took the project in a different direction. (Raskin’s original Macintosh design was eventually released as the Canon Cat, a genuinely interesting machine with a genuinely interesting UI.) The Macintosh under Jobs became a budget Lisa, with Jobs taking all of Raskin’s common-sense advice about the Lisa to extremes (while presenting the ideas as his own). The end result is something that looks vaguely like the Alto’s interface to someone who has never used one or heard a detailed description of how one works, but that contains none of the Alto’s interesting ideas and nothing that wasn’t already present in it.

Of course, after Jobs got fired, Apple started drinking his Kool-Aid, and after 1987 the whole industry started drinking it. When Apple became cool again in 1997 (after twelve years of the Macintosh brand bombing & Apple narrowly avoiding bankruptcy by leaning heavily into deep-discount deals with public schools), people started embracing this ethos more explicitly. (In the wake of the dot-com crash, a popular book called “Don’t Make Me Think” came out, articulating Jobs’ paternalistic attitude toward users for an audience of web developers & other folks who had lived through twelve years of the Macintosh being a pariah.)

This fixation on the myth of the Macintosh, even when the point is to clear up misconceptions, is part of the problem. The Macintosh looms large because it was so influential, but everything good about GUIs came through the Macintosh from the Alto, while almost everything bad about them came from mistakes made by the Macintosh team and later spun by marketing materials as intentional. The Macintosh shouldn’t overshadow actually-interesting GUI experiments that aren’t Alto-derived (like Sketchpad, NLS, and Swyft), or the actual heirs of the Alto legacy (Squeak), or, for that matter, interesting tangents like NeWS.

The most useful thing we can do, when trying to imagine a desirable future for computing, is to try to forget the entire Macintosh legacy and send our minds back to 1976.