May 14th, 2018

Hard Stare

On GNOME 3 and design simplicity

(Being a sort of coda to Why I don't use GNOME Shell.)

[EDIT: copy-pasta fixed. Sorry about that.]

Someone on the Ubuntu user list was saying that they gave up on Nautilus in GNOME 3 when the developers removed the split-pane feature.

That in itself wasn't a deal-breaker for me, but the removal of support for desktop icons more or less is. I also dislike the desktop layout. GNOME 3 fans tell me "it keeps out of my way" but that huge top panel, almost totally unused, is an egregious waste of space. Along with desktop icons, notification icons in the top panel are now deprecated. The username/network/volume/brightness controls are all merged into 1, for no good reason I can see.

At least Unity put the menus in there -- a good big target to hit.

This is the thing that irks me.

Many parts of older UIs, back in the 1980s when things were still developing, were designed on the basis of solid academic research. So, for example, Fitts's Law is behind the Mac's top menu bar.

Lots of people curse at it, but they don't realise there is science behind it.
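The screen-edge advantage drops straight out of the law itself. A rough sketch, using the common Shannon formulation of Fitts's Law with made-up illustrative constants (the function name and the `a`/`b` values are mine, not from any particular study):

```python
import math

def fitts_mt(distance, width, a=0.2, b=0.1):
    """Shannon formulation of Fitts's Law: MT = a + b * log2(D/W + 1).
    distance = how far the pointer travels, width = target depth along
    the direction of travel. a and b are illustrative constants in
    seconds, not measured values."""
    return a + b * math.log2(distance / width + 1)

# An in-window menu bar might be ~20 px deep, 500 px away:
in_window = fitts_mt(distance=500, width=20)

# A menu bar at the screen edge is effectively bottomless, because the
# pointer stops at the edge no matter how hard you throw it -- model
# that as a very deep target:
screen_edge = fitts_mt(distance=500, width=5000)

print(f"in-window:   ~{in_window:.2f} s")
print(f"screen-edge: ~{screen_edge:.2f} s")
```

The absolute numbers are fiction, but the shape of the result is the point: the edge target is always quicker, because you can overshoot it with impunity.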

Microsoft, constrained by avoiding a look-and-feel lawsuit, moved the menu bar from the top of the screen into the window. It's a much harder target to hit, but it's different and that was the main thing.

(Acorn didn't have menu bars at all. Everything was done with context menus. They take no screen space, ideal for a desktop displaying on a CGA-res monitor (640*200 at first, or 640*350), and require no mouse movement at all -- but you have to know they're there and how to summon them.)

But once there was a difference, people started to form preferences, and holy wars raged over it.

Take Apple's single mouse button. There are studies, with solid numbers. It takes thought to pick what button to click. A lot for beginners, a fraction of a second for experts, but thought, every time. So Apple reduced it to one.

Microsoft, appealing to "power users", gave you 2. The original Unix machines, and Acorn, 3.

3 is more powerful, but it takes decision-making time.
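The usual model for that decision-making cost is the Hick-Hyman law (my attribution, not the post's): choice time grows with the log of the number of alternatives. A sketch with an illustrative constant:

```python
import math

def hick_rt(n_choices, b=0.15):
    """Hick-Hyman law: decision time ~ b * log2(n + 1), where n is the
    number of equally likely alternatives. b is an illustrative constant
    in seconds per bit, not a measured value."""
    return b * math.log2(n_choices + 1)

for buttons in (1, 2, 3):
    print(f"{buttons} button(s): ~{hick_rt(buttons):.3f} s to choose")
```

Again, the constants are invented; what matters is that the cost never reaches zero for more than one button, and it's paid on every single click.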

That, and having to aim at in-window menu bars, have wasted millions, billions, of man-hours across the world over 3½ decades.

In System 7, Apple made the titles of aliases italic. You can't set filenames in italics, so if the filename was in italics, it wasn't you. It was the system telling you something -- that this wasn't the original file, it was a pointer to it.

In Win95, Microsoft couldn't do that, because of look-and-feel lawsuits, so it put a little curvy arrow in the corner. Easier to miss, but perhaps more logical. Later, Apple copied that back again. (!)

Lawsuits and holy war. Powerful reasons, but bogus ones. Some people don't like Apple's choices, but Apple had reasons for making those choices.

Now, happily, that mouse-button/menu-bar stuff is moot, because of touchscreens.

But the GNOME devs, in the admirable pursuit of simplicity and a desktop that's as easy as a phone, are not doing the science. I suspect they don't even know the research exists. I'm damned sure they didn't study it.

They're just identifying features they don't use, and removing them. No consultation, no research, just "we can get rid of that".

But it is virtually an axiom: you cannot get to a simple design by starting with a complicated design and removing bits.

Simplicity has to be the goal from the start.

You can't write a haiku by starting with a novel and removing words.

But that's what they are trying to do.