Wed, Apr. 19th, 2017, 09:12 pm
The state of the Linux desktop

A summary of where we are and where we might be going next.

Culled from a couple of very lengthy CIX posts.

A "desktop" means a whole rich GUI with an actual desktop -- a background you can put things on, which can hold folders and icons. It also includes an app launcher, a file manager, generally a wastebin or something, accessory apps such as a text editor, calculator, archive manager, etc. It can mount media and show their contents. It can unmount them again. It can burn rewritable media such as CDs and DVDs.

The whole schmole.

Some people don't want this and use something more basic, such as a plain window manager. No file manager, or they use the command line, or they pick their own, along with their own text editor etc., which are not integrated into a single GUI.

This is still a GUI, still a graphical UI, but may include little or nothing more than window management. Many Unix users want a bunch of terminals and nothing else.

A desktop is an integrated suite, with all the extras, like you get with Windows or a Mac, or back in the day with OS/2 or something.

The Unix GUI stack is as follows:
Read more...

Fri, Mar. 31st, 2017, 02:58 pm
The art of Sinclair -- in Agile terms, making computers that are "just barely good enough"

So in a thread on CIX, someone was saying that the Sinclair computers were irritating and annoying, cut down too far, cheap and slow and unreliable.

That sort of comment still kinda burns after all these decades.

I was a Sinclair owner. I loved my Spectrums, spent a lot of time and money on them, and still have 2 working ones today.

Yes, they had their faults, but for all those who sneered and snarked at their cheapness and perceived nastiness, *that was their selling point*.

They were working, usable, useful home computers that were affordable.

They were transformative machines, transforming people, lives, economies.

I had a Spectrum not because I massively wanted a Spectrum -- I would have rather had a BBC Micro, for instance -- but because I could afford a Spectrum. Well, my parents could, just barely. A used one.

My 2nd, 3rd and 4th ones were used, as well, because I could just about afford them.

If all that had been available were proper, serious, real computers -- Apples, Acorns, even early Commodores -- I might never have got one. My entire career would never have happened.

A BBC Micro was pushing £350. My used 48K Spectrum was £80.

One of those is doable for what parents probably worried was a kid's toy that might never be used for anything productive. The other was the cost of a car.
Read more...

Tue, Mar. 28th, 2017, 12:40 am
The successors to the Z80-based micros of the early 1980s which never happened. Or did they?




Although we almost never saw any of them in Europe, there were later models in the Z80 family.

The first successors -- the Z8000 (1979, 16-bit) and the later Z80000 (1986, 32-bit) -- were not Z80-compatible. They did not do well.

Zilog did learn, though, and the contemporaneous Z800, which was Z80 compatible, was renamed the Z280 and relaunched in 1987. 16-bit, onboard cache, very complex instruction set, could handle 16MB RAM.

Hitachi did the HD64180 (1985), a faster Z80 with an onboard MMU that could handle 512 kB of RAM. This was licensed back to Zilog as the Z64180.

Then Zilog did the Z180, an enhancement of that, which could handle 1MB RAM & up to 33MHz.
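
(As a rough illustration of how these banked MMUs stretched a 16-bit address space: the sketch below is simplified and doesn't follow the exact Z180 register layout or its split into common and banked areas, but the principle is the same -- a base register, shifted left, is added to the 16-bit logical address to produce a wider physical one. The function name and granularity here are mine, for illustration only.)

    #include <inttypes.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Simplified sketch: a 16-bit logical address plus a base register
     * (shifted left 12 bits, i.e. 4 kB granularity) gives a 20-bit
     * physical address, so a Z80-ish CPU can reach up to 1 MB of RAM
     * while running code still sees a plain 64 kB map. */
    static uint32_t mmu_translate(uint16_t logical, uint8_t base)
    {
        return (((uint32_t)base << 12) + logical) & 0xFFFFF; /* keep 20 bits */
    }

    int main(void)
    {
        /* With a base of 0x70, logical address 0x8000 lands at physical 0x78000. */
        printf("0x%05" PRIX32 "\n", mmu_translate(0x8000, 0x70));
        return 0;
    }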

That was enhanced into the Z380 (1994) -- 16/32-bit, 20MHz, but neither derived from nor compatible with the Z280.

Then came the eZ80, at up to 50MHz. No MMU, but 24-bit registers for 16MB of RAM.

Probably the most logical successor was the ASCII Corp R800 (1990), an extended 16-bit Z800-based design, mostly Z80 compatible but double-clocked on a ~8MHz bus for ~16MHz operation.

So, yes, lots of successor models -- but the problem is, too many, too much confusion, and no clear successors. Zilog, in other words, had the same failure as its licensees: it didn't trade on the advantages of its previous products. It did realise this and re-align itself, and it's still around today, but it did so too late.

The 68000 wasn't powerful enough to emulate previous-generation 8-bit processors. Possibly one reason why Acorn went its own way with the ARM, which was fast enough to do so -- the Acorn ARM machines came equipped with an emulator to run 6502 code. It emulated a 6502 "Tube" processor -- i.e. in an expansion box, with no I/O of its own. If your code was clean enough to run on that, you could run it on RISC OS out of the box.

Atari, Commodore, Sinclair and Acorn all abandoned their 8-bit heritage and did all-new, proprietary machines. Acorn even did its own CPU, giving it way more CPU power than its rivals, allowing emulation of the old machines -- not an option for the others, who bought in their CPUs.

Amstrad threw in the towel and switched to PC compatibles. A wise move, in the long view.

The only line that sort of transitioned was MSX.

MSX 1 machines (1983) were so-so, decent but unremarkable 8-bits.

MSX 2 (1985) were very nice 8-bitters indeed, with bank-switching for up to 4MB RAM and a primitive GPU that gave good graphics by Z80 standards. Floppy drives and 128 kB of RAM were commonly fitted as standard.

MSX 2+ (1988) were gorgeous. Some could handle ~6MHz, and the GPU had at least 128 kB of VRAM, so they had serious video capabilities for 8-bit machines -- e.g. 19K colours.

MSX Turbo R (1990) were remarkable. Effectively a ~30MHz 16-bit CPU, 96 kB ROM, 256 kB RAM (some battery-backed), a GPU with its own 128 kB RAM, and stereo sound via multiple sound chips plus MIDI.

As a former Sinclair fan, I'd love to see what a Spectrum built using MSX Turbo R technology could do.


Postscript

Two 6502 lines did transition, kinda sortof.

Apple did the Apple ][GS (1986), with a WDC 65C816 16-bit processor. Its speed was tragically throttled, and the machine was killed off very young so as not to compete with the still-new Macintosh line.

Acorn's Communicator (1985) also had a 65C816, with a ported 16-bit version of Acorn's MOS operating system, BBC BASIC, the View wordprocessor, ViewSheet spreadsheet, Prestel terminal emulator and other components. Also a dead end.

The 65C816 was also available as an add-on for several models in the Commodore 64 family, and there was the GEOS GUI-based desktop to run on it, complete with various apps. Commodore itself never used the chip, though.

Mon, Mar. 6th, 2017, 02:37 pm
Follow-up: the family links between DOS, OS/2, NT and VMS

My previous post was an improvised and unplanned comment. I could have structured it better, and it caused some confusion on https://lobste.rs/

Dave Cutler did not write OS/2. AFAIK he never worked on OS/2 at all in the days of the MS-IBM pact -- he was still at DEC then.

Many sources focus on only one side of the story -- the DEC side. This is important, but it is only half the tale.

IBM and MS got very rich working together on x86 PCs and MS-DOS. They carefully planned its successor: OS/2. IBM placed restrictions on this which crippled it, but it wasn't apparent at the time just how bad this would turn out to be.

In the early-to-mid 1980s, it seemed apparent to everyone that the most important next step in microcomputers would be multitasking.

Even small players like Sinclair thought so -- the QL was designed as the first cheap 68000-based home computer. No GUI, but multitasking.

I discussed this a bit in a blog post a while ago: http://liam-on-linux.livejournal.com/46833.html

Apple's Lisa was a sideline: too expensive. Nobody picked up on its true significance.

Then, 2 weeks after the QL, came the Mac. Everything clever but expensive in the Lisa stripped out: no multitasking, little RAM, no hard disk, no slots or expansion. All that was left was the GUI. But that was the most important bit, as Steve Jobs saw and nobody much else did.

So, a year later, the ST had a DOS-like OS with a bolted-on GUI. No shell, just a GUI. Fast-for-the-time CPU, no fancy chips, and it did great. It had the original, uncrippled version of DR GEM. Apple's lawsuit meant that PC GEM was crippled: no overlapping windows, no desktop drive icons or trashcan, etc.

Read more...

Sat, Mar. 4th, 2017, 04:20 pm
The family link between OS/2 and Windows NT

Windows NT was allegedly partly developed on OS/2. Many MSers loved OS/2 at the time -- they had co-developed it, after all. But there was more to it than that.

Windows NT was partly based on OS/2. There were 3 branches of the OS/2 codebase:

[a] OS/2 1.x – at IBM’s insistence, for the 80286. The mistake that doomed OS/2 and IBM’s presence in the PC industry, the industry it had created.

[b] OS/2 2.x – IBM went it alone with the 80386-specific version.

[c] OS/2 3.x – Portable OS/2, planned to be ported to multiple different CPUs.

After the “divorce”, MS inherited Portable OS/2. It was a skeleton and a plan. Dave Cutler was hired from DEC, which had refused to let him pursue his PRISM project for a modern CPU and a successor to VMS. Cutler was given the Portable OS/2 project to complete. He did, fleshing it out with concepts and plans derived from his experience with VMS and his plans for PRISM.

Read more...

Wed, Mar. 1st, 2017, 05:15 pm
I was challenged to write something positive about computing research, with concrete suggestions.

When was the last time you saw a critic write a play, compose a symphony, carve a statue?

I've seen a couple of attempts. I thought they were dire, myself. I won't name names (or media), as these are friends of friends.

Some concrete examples. I have given dozens on liam-on-linux.livejournal.com, but I wonder if I can summarise.

[1]

Abstractions. Some of our current core conceptual models are poor. Bits, bytes, directly accessing and managing memory.

If the programmer needs to know whether they are on a 32-bit or 64-bit processor, or whether it's big-endian or little-endian, the design is broken.
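
(For anyone who hasn't hit this in practice, here is a trivial C snippet showing the sort of leak I mean -- nothing here beyond standard C, and it's only an illustration of the problem, not a fix. The moment a program looks at raw memory, the machine's byte order shows straight through into its results.)

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t value = 0x11223344;
        const uint8_t *bytes = (const uint8_t *)&value;

        /* On a little-endian CPU this prints 44 33 22 11;
         * on a big-endian one, 11 22 33 44. The abstraction leaks
         * as soon as the code inspects memory directly. */
        for (int i = 0; i < 4; i++)
            printf("%02X ", bytes[i]);
        putchar('\n');
        return 0;
    }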

Higher-level abstractions have been implemented and sold. This is not a pipedream.

One that seems to work is atoms and lists. That model has withstood nearly 50 years of competition and it still thrives in its niche. It's underneath Lisp and Scheme, but also several languages far less arcane, and more recently, Urbit with Nock and Hoon. There is room for research here: work out a minimal abstraction set based on list manipulation and tagged memory, and find an efficient way to implement it, perhaps at microcode or firmware level.
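
(A sketch of what I mean by tagged cells -- not a proposal for the research project itself, just the atoms-and-lists data model in miniature. It's in plain C purely for familiarity; the type and function names are mine.)

    #include <stdio.h>
    #include <stdlib.h>

    /* Every cell carries a tag, so the runtime always knows what it is
     * holding: a symbol (atom) or a pair of pointers (cons cell). No
     * raw bits or word sizes are exposed to the programmer. */
    typedef struct Cell Cell;
    struct Cell {
        enum { ATOM, PAIR } tag;
        union {
            const char *name;              /* ATOM: a symbol           */
            struct { Cell *car, *cdr; } p; /* PAIR: a cons cell        */
        } u;
    };

    static Cell *atom(const char *name)
    {
        Cell *c = malloc(sizeof *c);
        c->tag = ATOM;
        c->u.name = name;
        return c;
    }

    static Cell *cons(Cell *car, Cell *cdr)
    {
        Cell *c = malloc(sizeof *c);
        c->tag = PAIR;
        c->u.p.car = car;
        c->u.p.cdr = cdr;
        return c;
    }

    static void print(const Cell *c)
    {
        if (c == NULL)      { printf("()"); return; }
        if (c->tag == ATOM) { printf("%s", c->u.name); return; }
        putchar('(');
        print(c->u.p.car);
        printf(" . ");
        print(c->u.p.cdr);
        putchar(')');
    }

    int main(void)
    {
        /* The list (a b c), built from pairs: (a . (b . (c . ()))) */
        Cell *list = cons(atom("a"), cons(atom("b"), cons(atom("c"), NULL)));
        print(list);
        putchar('\n');
        return 0;
    }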

Read more...

Mon, Feb. 13th, 2017, 07:51 pm
USB C. Everyone's complaining. I can't wait. I still hope for cable nirvana.

Things have been getting better for a while now. For smaller gadgets, micro-USB is now the standard charging connector. Cables are becoming a consumable for me, but they're cheap and easy to find.

But it only goes in one way round, and it's hard to see or tell which way that is. And not all my gadgets want it the same way round, meaning I have to either remember or peer at a tiny socket and try to guess.

So conditions were right for an either-way-round USB connector.


Read more...

Mon, Feb. 6th, 2017, 03:35 pm
On being boggled by technological advances [tech blog post, by me]

I had Sinclair Microdrives on my ZX Spectrum. They were better than cassette tape, but not better than anything else -- ~90 kB of slowish, rather unreliable storage.

So I bought an MGT DISCiPLE and an old cheap 5¼" 80-track, DS/DD drive.

780 kB of storage! On demand! Programs loaded in seconds! When I upgraded to an ex-demo 128K Spectrum from Curry's, even 128 kB programs loaded in a (literal) flash!
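
(Where that 780 kB comes from, roughly -- the sector layout below is from memory, so treat the directory overhead as approximate rather than gospel.)

    #include <stdio.h>

    int main(void)
    {
        /* 80 tracks x 2 sides x 10 sectors x 512 bytes = 800 KiB raw;
         * the DISCiPLE's filing system keeps a little back for its
         * directory, which is how you end up with roughly 780 kB free. */
        long raw = 80L * 2 * 10 * 512;
        printf("raw capacity: %ld bytes = %ld KiB\n", raw, raw / 1024);
        return 0;
    }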

(MGT's firmware strobed the Spectrum's screen border, in homage to loading from tape, so you could see the data streaming into memory.)

That was the first time I remember being excited by the size and speed of my computer's storage.
Read more...

Sun, Jan. 15th, 2017, 03:36 pm
How I chose & bought a cheap Chinese smartphone [tech blog post, by me]

So... when the lack of apps for my beloved BlackBerry Passport, and the issues with running sideloaded Android apps, became problematic, I decided to check out a cheap Chinese Android phablet.

(P.S. The Passport is for sale! Let me know if you're interested.)

The Passport superseded a Samsung Galaxy Note 2, which subsequently got stolen, unfortunately. The Note was decent, occasionally sluggish, ran an elderly version of Android with no updates in ages, and had a totally useless stylus I never touched. It replaced an iPhone 4, which replaced an HTC Desire HD, which replaced a Nokia E90 Communicator -- the best form-factor for a smartphone I've ever had, but nothing like it exists any more.

I wanted a dual-core or quad-core phablet, bigger than 5.5", with dual SIM and a memory card. That was my starting point. I don't have or use a tablet and never have -- I'm a keyboard junkie. I spend a lot of time surfing the web, on social networks, reading books and things on my phone. I wanted one as big as I could get, but still pocketable. My nicked Samsung was 5.5" and I wanted a little larger. I tried a 6" phablet in a shop and wanted still bigger if possible. I also tried a 6.8" Lenovo Phab Pro in a shop and that was a bit too big (but I might be persuaded -- with a tiny bezel, such a device might be usable).
Read more...

Fri, Nov. 11th, 2016, 03:54 pm
Why I don't use GNOME Shell

Although the launch of GNOME 3 was a bumpy ride and it got a lot of criticism, it's coming back. It's the default desktop of multiple distros again now. Allegedly even Linus Torvalds himself uses it. People tell me that it gets out of the way.

I find this curious, because I find it a little clunky and obstructive. It looks great, but for me, it doesn’t work all that well. It’s OK — far better than it was 2-3 years ago. But while some say it gets out of the way and lets them work undistracted, it gets in my way, because I have to adapt to its weird little quirks. It will not adapt to mine. It is dogmatic: it says, you must work this way, because we are the experts and we have decided that this is the best way.

So, on OS X or Ubuntu, I have my dock/launcher thing on the left, because that keeps it out of the way of the scrollbars. On Windows or XFCE, I put the task bar there. For all 4 of these environments, on a big screen, it’s not too much space and gives useful info about minimised windows, handy access to disk drives, stuff like that. On a small screen, it autohides.

But not on GNOME, no. No, the gods of GNOME have decreed that I don’t need it, so it’s always hidden. I can’t reveal it by just putting my mouse over there. No, I have to click a strange word in the menu bar. “Activities”. What activities? These aren’t my activities. They’re my apps, folders, files, windows. Don’t tell me what to call them. Don’t direct me to click in a certain place to get them; I want them just there if there’s room, and if there isn’t, on a quick flick of the wrist to a whole screen edge, not a particular place followed by a click. It wastes a bit of precious menu-bar real-estate with a word that’s conceptually irrelevant to me. It’s something I have to remember to do.

That’s not saving me time or effort, it’s making me learn a new trick and do extra work.

The menu bar. Time-honoured UI structure. Shared by all post-Mac GUIs. Sometimes it contains a menu, efficiently spread out over a nice big easily-mousable spatial range. Sometimes that’s in the window; whatever. The whole width of the screen in Mac and Unity. A range of commands spread out.

On Windows, the centre of the title bar is important info — what program this window belongs to.

On the Mac, that’s the first word of the title bar. I read from left to right, because I use a Latinate alphabet. So that’s a good place too.

On GNOME 3, there’s some random word I don’t associate with anything in particular as the first word, then a deformed fragment of an icon that’s hard to recognise, then a word, then a big waste of space, then the blasted clock! Why the clock? Are they that obsessive, such clock-watchers? Mac and Windows and Unity all banish the clock to a corner. Not GNOME, no. No, it’s front and centre, one of the most important things in one of the most important places.

Why?

I don’t know, but I’m not allowed to move it.

Apple put its all-important logo there in early versions of Mac OS X. They were quickly told not to be so egomaniacal. GNOME 3, though, enforces it.

On Mac, Unity, and Windows, in one corner, there’s a little bunch of notification icons. Different corners unless I put the task bar at the top, but whatever, I can adapt.

On GNOME 3, no, those are rationed. Things are hidden under sub-options. In the pursuit of cleanliness and tidiness, things like my network status are hidden away.

That’s my choice, surely? I want them in view. I add extra ones. I like to see some status info. I find it handy.

GNOME says no, you don’t need this, so we’ve hidden it. You don’t need to see a whole menu. What are you gonna do, read it?

It reminds me of the classic Bill Hicks joke:

"You know I've noticed a certain anti-intellectualism going around this country ever since around 1980, coincidentally enough. I was in Nashville, Tennessee last weekend and after the show I went to a waffle house and I'm sitting there and I'm eating and reading a book. I don't know anybody, I'm alone, I'm eating and I'm reading a book. This waitress comes over to me (mocks chewing gum) 'what you readin' for?'...wow, I've never been asked that; not 'What am I reading', 'What am I reading for?’ Well, goddamnit, you stumped me... I guess I read for a lot of reasons — the main one is so I don't end up being a f**kin' waffle waitress. Yeah, that would be pretty high on the list. Then this trucker in the booth next to me gets up, stands over me and says [mocks Southern drawl] 'Well, looks like we got ourselves a readah'... aahh, what the fuck's goin' on? It's like I walked into a Klan rally in a Boy George costume or something. Am I stepping out of some intellectual closet here? I read, there I said it. I feel better."

Yeah, I read. I like reading. It’s useful. A bar of words is something I can scan in a fraction of a second. Then I can click on one and get… more words! Like some member of the damned intellectual elite. Sue me. I read.

But Microsoft says no, thou shalt have ribbons instead. Thou shalt click through tabs of little pictures and try and guess what they mean, and we don’t care if you’ve spent 20 years learning where all the options were — because we’ve taken them away! Haw!

And GNOME Shell says, nope, you don’t need that, so I’m gonna collapse it all down to one menu with a few buried options. That leaves us more room for the all-holy clock. Then you can easily see how much time you’ve wasted looking for menu options we’ve removed.

You don’t need all those confusing toolbar buttons neither, nossir, we gonna take most of them away too. We’ll leave you the most important ones. It’s cleaner. It’s smarter. It’s more elegant.

Well, yes it is, it’s true, but you know what, I want my software to rank usefulness and usability above cleanliness and elegance. I ride a bike with gears, because gears help. Yes, I could have a fixie with none, it’s simpler, lighter, cleaner. I could even get rid of brakes in that case. Fewer of those annoying levers on the handlebars.

But those brake and gear levers are useful. They help me. So I want them, because they make it easier to go up hills and easier to go fast on the flat, and if it looks less elegant, well I don’t really give a damn, because utility is more important. Function over form. Ideally, a balance of both, but if offered the choice, favour utility over aesthetics.

Now, to be fair, yes, I know, I can install all kinds of GNOME Shell extensions — from Firefox, which freaks me out a bit. I don’t want my browser to be able to control my desktop, because that’s a possible vector for malware. A webpage that can add and remove elements to my desktop horrifies me at a deep level.

But at least I can do it, and that makes GNOME Shell a lot more usable for me. I can customise it a bit. I can add elements, and I could make my favourites bar permanent, but honestly, for me, this is core functionality and I don’t think it should be an add-on. The favourites bar still won’t easily let me see how many instances of an app are running, as the Unity one does. Nor does it hold minimised windows and handy shortcuts, as the Mac one does. It’s less flexible than either.

There are things I like. I love the virtual-desktop switcher. It’s the best on any OS. I wish GNOME Shell were more modular, because I want that virtual-desktop switcher on Unity and XFCE, please. It’s superb, a triumph.

But it’s not modular, so I can’t. And it’s only customisable to a narrow, limited degree. And that means not to the extent that I want.

I accept that some of this is because I’m old and somewhat stuck in my ways and I don’t want to change things that work for me. That’s why I use Linux, because it’s customisable, because I can bend it to my will.

I also use Mac OS X — I haven’t upgraded to Sierra yet, so I won’t call it macOS — and anyway, I still own computers that run MacOS, as in MacOS 6, 7, 8, 9 — so I continue to call it Mac OS X. What this tells you is that I’ve been using Macs for a long time — since the late 1980s — and whereas they’re not so customisable, I am deeply familiar and comfortable with how they work.

And Macs inspired the Windows desktop and Windows inspired the Linux desktops, so there is continuity. Unity works in ways I’ve been using for nearly 30 years.

GNOME 3 doesn’t. GNOME 3 changes things. Some in good ways, some in bad. But they’re not my ways, and they do not seem to offer me any improvement over the ways I’m used to. OS X and Unity and Windows Vista/7/8/10 all give me app searching as a primary launch mechanism; it’s not a selling point of GNOME 3. The favourites bar thing isn’t an improvement on the OS X Dock or Unity Launcher or Windows Taskbar — it only delivers a small fraction of the functionality of those. The menu bar is if anything less customisable than the Mac or Unity ones, and even then, I have to use extensions to do it. If I move to someone else’s computer, all that stuff will be gone.

So whereas I do appreciate what it does and how and why it does so, I don’t feel like it’s for me. It wants me to change to work its way. The other OSes I use — OS X daily, Ubuntu Unity daily, Windows occasionally when someone pays me — don’t.

So I don’t use it.

Does that make sense?
