Re: Guest rant from ESR


From: peter (Peter da Silva)
Subject: Re: Guest rant from ESR
Date: 20:16 on 18 Apr 2004
	"The user interface -- the entire user experience -- should be
	 designed first, and the underlying implementation should be
	 built to support the design."

Yes and no. Like every other broad statement, this one is only partially
true. I'd say it's maybe 20-30% true, which is actually pretty good as
broad statements go. But you gotta draw a line somewhere and say 'OK, this
part is the bit we're designing around the GUI, and these parts here are
what we're building that on' because otherwise, well, you get Mac OS 9.

CUPS is a crappy example. The parts of CUPS that are embedded like a cyst
inside Mac OS X printing need to be ripped out and refactored and shoved back
in with a better user interface. "Now you connect to port 631 or whatever
it is and finish the configuration in Safari" is an abomination.

Mac OS X itself is a good example. Particularly when you compare Mac OS X
to Mac OS 9. Mac OS, the classic Mac OS, is a perfect example of what you
get when you design everything from the ground up around the user interface.
The first version barely had an operating system underneath, and you had
people declaring there shouldn't be one at all. The result was fifteen years
of increasingly frantic attempts to provide operating system services in
the user interface, then realising you need an OS after all and wedging
it in bit by bit without breaking anything.

Redesigning an application platform that's been badly factored without
breaking working applications that have made broad assumptions about the
environment is *hard*. Apple finally gave up and started over with OS X.
Microsoft has managed to avoid it so far by spending more money than
anyone else in the world on adding layers of ugly paint to hide the
layers of ugly paint underneath. Commodore-Amiga didn't survive long
enough, but their decision not to enforce shared-versus-private memory
in applications was clearly coming back to haunt them... and they
started with what may be the best-designed personal computer OS of the
'80s or early '90s.

So even when you DO try to get the factoring right, you can end up with
a mess. But if you just start at the top and work down you're guaranteed
to end up with something that's fatally dependent on your original UI
decisions.

Oh, and figuring out where to draw the line, that's tough. Thompson and
Ritchie back in 1969 managed to find a good place to draw the line for the
user interface of the day: the command line. They had an OS that provided
file, scheduling, and I/O services to applications in a way that hid most
of the complexity, and they had a command line user interface built on top
of this that ran applications but ignored everything but their input and
output while they were running. It was a good mix, for command lines.
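
Just to make that concrete, here's roughly what that contract looks
like in C. This is a sketch of mine, not the historical code: no
pipes, no quoting, no redirection. The shell reads a command, asks
the OS to run it, and waits; it knows nothing about the program
beyond its name, its arguments, and where its input and output go.

    /* A minimal command-line "shell" in the 1969 mold: read a
     * command, hand it to the OS, wait for it to finish. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        char line[1024];

        for (;;) {
            fputs("$ ", stdout);
            fflush(stdout);
            if (fgets(line, sizeof line, stdin) == NULL)
                break;                    /* EOF: the user is done */
            line[strcspn(line, "\n")] = '\0';

            /* Split into words.  The shell never looks past the
             * program's name and arguments; what the program does
             * while it runs is its own business. */
            char *argv[64];
            int argc = 0;
            for (char *w = strtok(line, " \t"); w && argc < 63;
                 w = strtok(NULL, " \t"))
                argv[argc++] = w;
            argv[argc] = NULL;
            if (argc == 0)
                continue;

            pid_t pid = fork();
            if (pid == 0) {
                execvp(argv[0], argv);    /* become the application */
                perror(argv[0]);
                _exit(127);
            } else if (pid > 0) {
                waitpid(pid, NULL, 0);    /* ignore it until it exits */
            }
        }
        return 0;
    }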

The X-Windows people tried similar factoring in X11: you had shells, the
window manager, and the X server, which launched programs, hooked windows
together, and let the user decide which program he was going to interact
with... and left everything else to the application. They ran on top of
the OS, and whether that OS was VMS or UNIX they treated it like that
old 1969 OS.

This didn't work, because it wasn't enough to let the applications just
throw anything they wanted at the user. They were experimenting with GUIs,
so they provided a pretty bare-bones GUI toolkit and let anyone else develop
their own. So pretty soon you had everyone in the world making their own
toolkit, and none of them worked well with any of the others.
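
For a sense of how bare-bones that was, here's a sketch of a raw-Xlib
program with no toolkit at all (mine, for illustration; build it with
cc hello.c -lX11). The server hands you a window and a stream of
events, and everything else, right down to repainting your own text,
is the application's problem:

    /* What "left everything else to the application" means in
     * practice: with only the X server underneath, even putting one
     * string in a window is your job. */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);  /* connect to the server */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 300, 100, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* No toolkit: the application repaints on every Expose event
         * and decides for itself what a keypress means. */
        XEvent ev;
        for (;;) {
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XDrawString(dpy, win, DefaultGC(dpy, scr),
                            20, 50, "hello, world", 12);
            else if (ev.type == KeyPress)
                break;                      /* any key quits */
        }

        XCloseDisplay(dpy);
        return 0;
    }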

Apple, now, they took the old Apple II and CP/M approach to the OS, and spent
a lot of time designing the best toolkit they could on top of as small an OS
as they could get away with. Good user interface, applications worked well
with each other ... one at a time, and it just worked. Problem is, it wasn't
very flexible. And fifteen years later OS 9 was still crippled by comparison
with other operating systems ... plus it was bigger and slower as well!

You can't go with spray-on system architecture, either. You need to break
things up into chunks that are small enough for people to deal with.

OK, that's subrant 1.

	"It all boils down to the fact that most aspects of Mac OS
	 X are not designed to be configurable or replacable; they
	 are designed to be usable, and to fit in with the design
	 of the rest of the system."

Subrant 2, short form... there are a lot of design decisions in Mac
OS X that are clearly there for *branding*, not because they're an
essential part of a good user experience. Apple has at various
times done a lot of work to make the Mac OS environment configurable
in a way that is entirely under the user's control, not the
developer's. The fact that I can change one default and get rid of
the "metal" look in any properly designed Cocoa application, for
one obvious example, is just the tip of the iceberg. All the theming
people are doing on Mac OS X is based on things Apple has been
putting in Mac OS since OS 8. The fact that these things are not
available to the user has nothing to do with them being "designed
to be usable"; it has to do with the Apple BRAND. My Mac OS X
desktop doesn't have the Apple Brand... the "mighty blue apple"
and the pinstripes and jello... but it's no less usable than yours.

It's not the applications that fit into the design of the system
that aren't configurable and replaceable. It's the mavericks like
iTunes and the new Finder and QuickTime Player. If they were all
well-behaved Cocoa applications they would fit better into the
design of the system *and* be more configurable, whether or not
the design flaunted the Mighty Apple Brand.

