

Fri, May. 13th, 2016, 01:50 pm
The decline & fall of DEC - & MS

(Repurposed email reply)

Although I was educated on & worked with DEC systems, I didn't have much to do with the company itself. Its support was good, but the kit was ludicrously expensive, and the software offerings expensive, slow and lacking competitive features. However, they also scored in some ways.

My 60,000' view:

Microsoft knew EXACTLY what it was doing with its practices when it built up its monopoly. It got lucky with the technology: its planned future super products flopped, but it turned on a dime & used what worked.

But killing its rivals, any potential rival? Entirely intentional.

The thing is that no other company was poised to effectively counter the MS strategy. Nobody.

MS' almost-entirely-software-only model was almost unique. Its ecosystem of apps and 3rd party support was unique.

In the end, it actually did us good. Gates wanted a computer on every desk. We got that.

The company's strategy called for open compatible generic hardware. We got that.

Only one platform, one OS, was big enough, diverse enough, to compete: Unix.

But commercial, closed, proprietary Unix couldn't. 2 ingredients were needed:

#1 COTS hardware - which MS fostered;
#2 FOSS software.

Your point about companies sharing their source is noble, but I think inadequate. The only thing that could compete with a monolithic software monopolist on open hardware was open software.

MS created the conditions for its own doom.

Apple cleverly leveraged FOSS Unix and COTS x86 hardware to take the Mac brand and platform forward.

Nobody else did, and they all died as a result.

If Commodore, Atari and Acorn had adopted similar strategies (as happened independently of them later, after their death, resulting in AROS, AFROS & RISC OS Open), they might have lived.

I can't see it fitting the DEC model, but I don't know enough. Yes, cheap low-end PDP-11s with FOSS OSes might have kept them going longer, but not saved them.

The deal with Compaq was catastrophic. Compaq was in Microsoft's pocket. I suspect that Intel leant on Microsoft and Microsoft then leant on Compaq to axe Alpha, and Compaq obliged. It also knifed HP OpenMail, possibly the Unix world's only viable rival to Microsoft Exchange.

After that it was all over bar the shouting.

Microsoft could not have made a success of OS/2 3 without Dave Cutler... But DEC couldn't have made a success out of PRISM either, I suspect. Maybe a stronger DEC would have meant Windows NT would never have happened.

Wed, Apr. 27th, 2016, 07:06 pm
Where did we all go wrong? And why doesn't anyone remember? [Tech blog post]

My contention is that a large part of the reason that we have the crappy computers that we do today -- lowest-common-denominator boxes, mostly powered by one of the kludgiest and most inelegant CPU architectures of the last 40 years -- is not technical, nor even primarily commercial or due to business pressures, but rather, it's cultural.

When I was playing with home micros (mainly Sinclair and Amstrad; the American stuff was just too expensive for Brits in the early-to-mid 1980s), the culture was that Real Men programmed in assembler and the main battle was Z80 versus 6502, with a few weirdos saying that 6809 was better than either. BASIC was the language for beginners, and a few weirdos maintained that Forth was better.

At university, I used a VAXcluster and learned to program in Fortran-77. The labs had Acorn BBC Micros in -- solid machines, the best 8-bit BASIC ever, and they could interface both with lab equipment over IEEE-488 and with generic printers and so on over Centronics parallel and its RS-423 interface [EDIT: fixed!], which could talk to RS-232 kit.

As I discovered when I moved into the professional field a few years later (1988), this wasn't that different from the pro stuff. A lot of apps were written in various BASICs, and in the old era of proprietary OSes on proprietary kit, for performance, you used assembler.

But a new wave was coming. MS-DOS was already huge and the Mac was growing strongly. Windows was on v2 and was a toy, but Unix was coming to mainstream kit, or at least affordable kit. You could run Unix on PCs (e.g. SCO Xenix), on Macs (A/UX), and my employers had a demo IBM RT-6150 running AIX 1.

Unix wasn't only the domain (pun intentional) of expensive kit priced in the tens of thousands.

A new belief started to spread: that if you used C, you could get near-assembler performance without the pain, and the code could be ported between machines. DOS and Mac apps started to be written (or rewritten) in C, and some were even ported to Xenix. In my world, nobody used stuff like A/UX or AIX, and Xenix was specialised. I was aware of Coherent as the only "affordable" Unix, but I never saw a copy or saw it running.

So this second culture of C code running on non-Unix OSes appeared. Then the OSes started to scramble to catch up with Unix -- first OS/2, then Windows 3, then, for a decade, the parallel universe of Windows NT, until XP became established and Win9x finally died. Meanwhile, Apple and IBM flailed around, until IBM surrendered and Apple merged with NeXT and switched to NeXTstep.

Now, Windows is evolving to be more and more Unix-like, with GUI-less versions, clean(ish) separation between GUI and console apps, a new rich programmable shell, and so on.

While the Mac is now a Unix box, albeit a weird one.

Commercial Unix continues to wither away. OpenVMS might make a modest comeback. IBM mainframes seem to be thriving; every other kind of big iron is now emulated on x86 kit, as far as I can tell. IBM has successfully killed off several efforts to do this for z Series.

So now, it's Unix except for the single remaining mainstream proprietary system: Windows. Unix today means Linux, while the weirdos use FreeBSD. Everything else seems to be more or less a rounding error.

C always was like carrying water in a sieve, so now, we have multiple C derivatives, trying to patch the holes. C++ has grown up but it's like Ada now: so huge that nobody understands it all, but actually, a fairly usable tool.

There's the kinda-sorta FOSS "safe C++ in a VM", Java. The proprietary kinda-sorta "safe C++ in a VM", C#. There's the not-remotely-safe kinda-sorta C in a web browser, JavaScript.

And dozens of others, of course.

Even the safer ones run on a basis of C -- so the lovely cuddly friendly Python, that everyone loves, has weird C-derived printing and string-formatting semantics to mess up the heads of beginners.
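
To make that concrete, here's a minimal sketch (my own illustrative values, not anything from the post) of the printf-style formatting Python inherited from C -- exactly the kind of thing that trips up beginners:

```python
# Python took its %-based string formatting straight from C's printf(),
# conversion specifiers, quirks and all. Values here are purely illustrative.
temperature = 21.456

# C-style conversions: %d truncates to a whole number, %.1f rounds to one place.
print("It is %d degrees (%.1f, to be precise)" % (temperature, temperature))
# -> It is 21 degrees (21.5, to be precise)

# Classic beginner trap: to substitute a tuple you need an extra trailing comma,
# otherwise % treats its elements as separate arguments and raises a TypeError.
print("Position: %s" % ((3, 4),))
# -> Position: (3, 4)
```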

Perl abandoned its base and planned a move onto a VM; the VM went wrong; and now, with a new VM, to general amazement and general lack of interest, Perl 6 is finally here.

Most of the others are still implemented in C on a Unix base, like Ruby, or sit on a JVM base, like Clojure and Scala.

So they still have C-like holes, and there are frequent patches and updates to try to make them able to retain some water for a short time, while the "cyber criminals" make hundreds of millions.

Anything else is "uncommercial" or "not viable for real world use".

Borland totally dropped the ball and lost a nice little earner in Delphi, but the language lives on as Free Pascal and so on.

Apple goes its own way, but has forgotten the truly innovative projects it had pre-NeXT, such as Dylan.

There were real projects that were actually used for real work, like Oberon the OS, written in Oberon the language. Real pioneering work in UIs, such as Jef Raskin's machines, the original Mac and Canon Cat -- forgotten. People rhapsodise over the Amiga and forget that the planned OS, CAOS, to be as radical as the hardware, never made it out of the lab. Same, on a smaller scale, with the Acorn Archimedes.

Despite that, of course, Lisp never went away. People still use it, but they keep their heads down and get on with it.

Much the same applies to Smalltalk. Still there, still in use, still making real money and doing real work, but forgotten all the same.

The Lisp Machines and Smalltalk boxes lost the workstation war. Unix won, and as history is written by the victors, now the alternatives are forgotten or dismissed as weird kooky toys of no serious merit.

The senior Apple people didn't understand the essence of what they saw at PARC: they only saw the chrome. They copied the chrome, not the essence, and now all that any of us have is the chrome. We have GUIs, but on top of the nasty kludgy hacks of C and the like. A late-'60s skunkworks project now runs the world, and the real serious research efforts to make something better, both before and after, are forgotten historical footnotes.

Modern computers are a vast disappointment to me. We have no thinking machines. The Fifth Generation, Lisp, all that -- gone.

What did we get instead?

Like dinosaurs, the expensive high-end machines of the '70s and '80s didn't evolve into their successors. They were just replaced. First came little cheapo 8-bits, not real or serious at all, although they were cheap and people did serious stuff with them because it was all they could afford. The early 8-bits ran semi-serious OSes such as CP/M, but when their descendants sold a thousand times more, those descendants weren't running descendants of that OS -- no, it and its creator died.

CP/M evolved into a multiuser multitasking 386 OS that could run multiple MS-DOS apps on terminals, but it died.

No, then the cheapo 8-bits thrived in the form of an 8/16-bit hybrid, the 8086 and 8088, and a cheapo knock-off of CP/M.

This got a redesign into something grown-up: OS/2.

Predictably, that died.

So the hacked-together GUI for DOS got re-invigorated with an injection of OS/2 code, as Windows 3. That took over the world.

The rivals - the Amiga, ST, etc? 680x0 chips, lots of flat memory, whizzy graphics and sound? All dead.

Then Windows got re-invented with some OS/2 3 ideas and code, and some from VMS, and we got Windows NT.

But the marketing men got to it and ruined its security and elegance, to produce the lipstick-and-high-heels Windows XP. That version -- insecure and flaky, with its terrible bodged-in browser -- was, of course, the one that sold.

Linux got nowhere until it copied the XP model. The days of small programs, everything's a text file, etc. -- all forgotten. Nope: lumbering GUI apps, CORBA and RPC and other weird plumbing, huge complex systems -- but it looks and works kinda like Windows and the Mac now, so people use it.

Android looks kinda like iOS and people use it in their billions. Newton? Forgotten. No, people have Unix in their pocket, only it's a bloated successor of Unix.

The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten. A proprietary microkernel Unix-like OS for phones -- Blackberry 10, based on QNX -- not Androidy enough, and bombed.

We have less and less choice, made from worse parts on worse foundations -- but it's colourful and shiny and the world loves it.

That makes me despair.

We have poor-quality tools, built on poorly-designed OSes, running on poorly-designed chips. Occasionally, fragments of older better ways, such as functional-programming tools, or Lisp-based development environments, are layered on top of them, but while they're useful in their way, they can't fix the real problems underneath.

Occasionally someone comes along and points this out and shows a better way -- such as Curtis Yarvin's Urbit: Lisp Machines re-imagined for the 21st century, layered on top of modern machines. But nobody gets it, and its programmer has some unpleasant and unpalatable ideas, so it's doomed.

And the kids who grew up after C won the battle deride the former glories, the near-forgotten brilliance that we have lost.

And it almost makes me want to cry sometimes.

We should have brilliant machines now, not merely Steve Jobs' "bicycles for the mind", but Gossamer Albatross-style hang-gliders for the mind.

But we don't. We have glorified 8-bits. They multitask semi-reliably, they can handle sound and video and 3D and look pretty. On them, layered over all the rubbish and clutter and bodges and hacks, inspired kids are slowly brute-forcing machines that understand speech, which can see and walk and drive.

But it could have been so much better.

Charles Babbage didn't finish the Difference Engine. It would have paid for him to build his Analytical Engine, and that would have given the Victorian British Empire the steam-driven computer, which would have transformed history.

But he got distracted and didn't deliver.

We started to build what a few old-timers remember as brilliant machines, machines that helped their users to think and to code, with brilliant -- if flawed -- software written in the most sophisticated computer languages yet devised, by the popular acclaim of the people who really know this stuff: Lisp and Smalltalk.

But we didn't pursue them. We replaced them with something cheaper -- with Unix machines, running an OS only a nerd could love. And then we replaced the Unix machines with something cheaper still -- the IBM PC, a machine so poor that the £125 ZX Spectrum had better graphics and sound.

And now, we all use descendants of that. Generally acknowledged as one of the poorest, most-compromised machines, based on descendants of one of the poorest, most-compromised CPUs.

Yes, over the 40 years since then, most of the rough edges have been polished out. The machines are now small, fast and power-frugal, with tons of memory and storage, and with great graphics and sound. But it's taken decades to get here.

And the OSes have developed. Now they're feature-rich, fairly friendly, really very robust considering the stone-age stuff they're built from.

But if we hadn't spent 3 or 4 decades making a pig's ear into a silk purse -- if we'd started with a silk purse instead -- where might we have got to by now?

Mon, Apr. 25th, 2016, 03:55 pm
Acorn: from niche to forgotten obscurity and total industry dominance at the same time

More retrocomputing meanderings -- whatever became of the ST, Amiga and Acorn operating systems?

The Atari ST's GEM desktop also ran on MS-DOS, DR's own DOS+ (a forerunner of the later DR-DOS) and today is included with FreeDOS. In fact the first time I installed FreeDOS I was *very* surprised to find my name in the credits. I debugged some batch files used in installing the GEM component.

The ST's GEM was the same environment. ST GEM was derived from GEM 1; PC GEM from GEM 2, crippled after an Apple lawsuit. Then they diverged. FreeGEM attempted to merge them again.

But the ST's branch prospered, before the rise of the PC killed off all the alternative platforms. Actual STs can be quite cheap now, or you can even buy a modern clone:

http://harbaum.org/till/mist/index.shtml

If you don't want to lash out but have a PC, the Aranym environment gives you something of the feel of the later versions. It's not exactly an emulator, more a sort of compatibility environment that enhances the "emulated" machine as much as it can using modern PC hardware.

http://aranym.org/

And the ST GEM OS was so modular that different 3rd parties cloned every component, separately. Some commercially, some as FOSS. The Aranym team basically put together a sort of "distribution" of as many FOSS components as they could, to assemble a nearly-complete OS, then wrote the few remaining bits to glue it together into a functional whole.

So, finally, after the death of the ST and its clones, there was an all-FOSS OS for it. It's pretty good, too. It's called AFROS, Atari Free OS, and it's included as part of Aranym.

I longed to see a merger of FreeGEM and Aranym, but it was never to be.

The history of GEM and TOS is complex.

Official Atari TOS+GEM evolved into TOS 4, which included the FOSS MiNT multitasking layer and isn't much like the original ROM version in the first STs.

The underlying TOS OS is not quite like anything else.

AIUI, CP/M-68K was a real, if rarely-seen, OS.

However, it proved inadequate to support GEM, so it was discarded. A new kernel was written using some of the tech from what was later to become DR-DOS on the PC -- something less like CP/M and more like MS-DOS: directories, separated with backslashes; FAT format disks; multiple executable types, 8.3 filenames, all that stuff.

None of the command-line elements of CP/M or any DR DOS-like OS were retained -- the kernel booted the GUI directly and there was no command line, like on the Mac.

This is called GEMDOS and AIUI it inherits from both the CP/M-68K heritage and from DR's x86 DOS-compatible OSes.

The PC version of GEM also ran on Acorn's BBC Master 512 which had an Intel 80186 coprocessor. It was a very clever machine, in a limited way.

Acorn's series of machines are not well-known in the US, AFAICT, and that's a shame. They were technically interesting, more so IMHO than the Apple II and III, TRS-80 series etc.

The original Acorns were 6502-based, but with good graphics and sound, a plethora of ports, a clear separation between OS, BASIC and add-on ROMs such as the various DOSes, etc. The BASIC was, I'd argue strongly, *the* best 8-bit BASIC ever: named procedures, local variables, recursion, inline assembler, etc. Also the fastest BASIC interpreter ever, and quicker than some compiled BASICs.

Acorn built for quality, not price; the machines were aimed at the educational market, which wasn't so price-sensitive, a model that NeXT emulated. Home users were welcome to buy them & there was one (unsuccessful) home model, but they were unashamedly expensive and thus uncompromised.

The only conceptual compromise in the original BBC Micro was that there was provision for ROM bank switching, but not RAM. The 64kB memory map was 50:50 split ROM and RAM. You could switch ROMs, or put RAM in their place, but not have more than 64kB. This meant that the high-end machine had only 32kB RAM, and high-res graphics modes could take 21kB or so, leaving little space for code -- unless it was in ROM, of course.

The later BBC+ and BBC Master series fixed that. They also took ROM cartridges, rather than bare chips inserted in sockets on the main board, and added a numeric keypad.

Acorn looked at the 16-bit machines in the mid-80s, mostly powered by Motorola 68000s of course, and decided they weren't good enough and that the tiny UK company could do better. So it did.

But in the meantime, it kept the 6502-based, resolutely-8-bit BBC Micro line alive with updates and new models, including ROM-based terminals and machines with a range of built-in coprocessors: faster 6502-family chips for power users, Z80s for CP/M, Intel's 80186 for kinda-sorta PC compatibility, the NatSemi 32016 with PANOS for ill-defined scientific computing, and finally, an ARM copro before the new ARM-based machines were ready.

Acorn designed the ARM RISC chip in-house, then launched its own range of ARM-powered machines, with an OS based on the 6502 range's. Although limited, this OS is still around today and can be run natively on a Raspberry Pi:

https://www.riscosopen.org/content/

It's very idiosyncratic -- the filesystem, the command line and the default editor are all totally unlike anything else. The file-listing command is CAT, the directory separator is a full stop (i.e. a period), while the root directory is called $. The editor is a very odd dual-cursor thing. It's fascinating, totally unrelated to the entire DEC/MS-DOS family and to the entire Unix family. There is literally and exactly nothing else even slightly like it.
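
To give a flavour of just how alien that looks in practice, here's a tiny sketch of my own (hypothetical paths, nothing official) that maps a familiar Unix-style path onto the conventions just described -- $ for the root, full stops between directories:

```python
def to_riscos_path(unix_path):
    # Rough illustration only: RISC OS calls the root directory '$' and uses
    # a full stop as the directory separator, as described above. Real paths
    # also carry filing-system and disc prefixes, which are ignored here.
    parts = [p for p in unix_path.split("/") if p]
    return "$." + ".".join(parts) if parts else "$"

print(to_riscos_path("/Documents/Letters/ToBank"))
# -> $.Documents.Letters.ToBank   (and you would list it with *CAT, not ls)
```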

It was the first GUI OS to implement features that are now universal: anti-aliased font rendering, and dragging and resizing of whole windows rather than an outline. Significantly, it was also the first graphical desktop to implement a taskbar, before NeXTstep and long before Windows 95.

It supports USB and can access the Internet and the web. There are free clients for chat, email, FTP, the WWW etc., and a modest range of free productivity tools, although most things are commercial.

But there's no proper inter-process memory protection, GUI multitasking is cooperative, and consequently it's not amazingly stable in use. It does support pre-emptive multitasking, but via the text editor, bizarrely enough, and only of text-mode apps. There was also a pre-emptive multitasking version of the desktop, but it wasn't very compatible, didn't catch on and is not included in current versions.

But for all that, it's very interesting, influential, shared-source, entirely usable today, and it runs superbly on the £25 Raspberry Pi, so there is little excuse not to try it. There's also a FOSS emulator which can run the modern freeware version:

http://www.marutan.net/rpcemu/

For users of the old hardware, there's a much more polished commercial emulator for Windows and Mac which has its own, proprietary fork of the OS:

http://www.virtualacorn.co.uk/index2.htm

There's an interesting parallel with the Amiga. Both Acorn and Commodore had ambitious plans for a modern multitasking OS which they both referred to as Unix-like. In both cases, the project didn't deliver and the ground-breaking, industry-redefiningly capable hardware was instead shipped with much less ambitious OSes, both of which nonetheless were widely-loved and both of which still survive in the form of multiple, actively-maintained forks, today, 30 years later -- even though Unix in fact caught up and long surpassed these 1980s oddballs.

AmigaOS, based in part on the academic research OS Tripos, has 3 modern forks: the FOSS AROS, on x86, and the proprietary MorphOS and AmigaOS 4 on PowerPC.

Acorn RISC OS, based in part on Acorn MOS for the 8-bit BBC Micro, has 2 contemporary forks: RISC OS 5, owned by Castle Technology but developed by RISC OS Open, shared source rather than FOSS, running on Raspberry Pi, BeagleBoard and some other ARM boards, plus some old hardware and RPC Emu; and RISC OS 4, now owned by the company behind VirtualAcorn, run by an ARM engineer who apparently made good money selling software ARM emulators for x86 to ARM Holdings.

Commodore and the Amiga are both long dead and gone, but the name periodically changes hands and reappears on various bits of modern hardware.

Acorn is also long dead, but its scion ARM Holdings designs the world's most popular series of CPUs, totally dominates the handheld sector, and outsells Intel, AMD & all other x86 vendors put together something like tenfold.

Funny how things turn out.

Tue, Apr. 5th, 2016, 02:57 pm
The AmigaOS lives on! It's up to 4.1 now. But is there any point today?

I am told it's lovely to use. Sadly, it only runs on obscure PowerPC-based kit that costs a couple of thousand pounds and can be out-performed by a £300 PC.

AmigaOS's owners -- Hyperion, I believe -- chose the wrong platform.

On a Raspberry Pi or something, it would be great. On obscure expensive PowerPC kit, no.

Also, saying that, I got my first Amiga in the early 2000s. If I'd had one 15y earlier, I'd probably have loved it, but I bought a 2nd hand Archimedes instead (and still think it was the right choice for a non-gamer and dabbler in programming).

A few years ago, with a LOT of work using 3 OSes and 3rd-party disk-management tools, I managed to coax MorphOS onto my Mac mini G4. Dear hypothetical gods, that was a hard install.

It's... well, I mean, it's fairly fast, but... no Wifi? No Bluetooth?

And the desktop. It got hit hard with the ugly stick. I mean, OK, it's not as bad as KDE, but... ick.

Learning AmigaOS when you already know more modern OSes -- OS X, Linux, gods help us, even Windows -- well, the Amiga seems pretty weird, and often for no good reason. E.g. a graphical file manager, but not all files have icons. They're not hidden, they just don't have icons, so if you want to see them, you have to do a second show-all operation. And the dependence on RAM disks, which are a historical curiosity now. And needing to right-click to show the menu bar when it's on a screen edge.

A lot of pointless arcana, just so Apple didn't sue, AFAICT.

I understand the love if one loved it back then. But now? Yeeeeeeaaaaaah, not so much.

Not that I'm proclaiming RISC OS to be the business now. I like it, but it's weird too. But AmigaOS does seem a bit primitive now. OTOH, if they sorted out multiprocessor support and memory protection and it ran on cheap ARM kit, then yeah, I'd be interested.

Wed, Mar. 30th, 2016, 08:33 pm
This will get me accused of fanboyism (again), but like it or not, Apple shaped the PC industry.

I recently read that a friend of mine claimed that "Both the iPhone and iPod were copied from other manufacturers, to a large extent."

This is a risible claim, AFAICS.

There were pocket MP3 jukeboxes before the iPod. I still own one. They were fairly tragic efforts.

There were smartphones before the iPhone. I still have at least one of them, too. Again, really tragic from a human-computer interaction point of view.


AIUI, the iPhone originated internally as a shrunk-down tablet. The tablet originated from a personal comment from Bill Gates to Steve Jobs: that although tablets were a great idea, people simply didn’t want them -- after all, Microsoft had made tablets and they didn’t sell.
Jobs’ response was that the Microsoft ones didn’t sell because they were no good, not because people didn’t want tablets. In particular, Jobs stated that using a stylus was a bad idea. (This is also a pointer as to why he cancelled the Newton. And guess what? I've got one of them, too.)

Gates, naturally, contested this, and Jobs started an internal project to prove him wrong: a stylus-free finger-operated slim light tablet. However, when it was getting to prototype form, he allegedly realised, with remarkable prescience, that the market wasn’t ready yet, and that people needed a first step — a smaller, lighter, simpler, pocketable device, based on the finger-operated tablet.

Looking for a role or function for such a device, the company came up with the idea of a smartphone.

Smartphones certainly existed, but they were a geek toy, nothing more.

Apple was bold enough to make a move that would kill its most profitable line — the iPod — with a new product. Few would be so bold.

I can’t think of any other company that would have been bold enough to invent the iPhone. We might have got to devices as capable as modern smartphones and tablets, but I suspect they’d have still been festooned in buttons and a lot clumsier to use.

It’s the GUI story again. Xerox sponsored the invention and original development but didn’t know WTF to do with it. Contrary to the popular history, it did productise it, but as a vastly expensive specialist tool. It took Apple to make it the standard method of HCI, and it took Apple two goes and many years. The Lisa was still too fancy and expensive, and the original Mac too cut-down and too small and compromised.

The many rivals’ efforts were, in hindsight, almost embarrassingly bad. IBM’s TopView was a pioneering GUI and it was rubbish. Windows 1 and 2 were rubbish. OS/2 1.x was rubbish, and to be honest, OS/2 2.x was the pre-iPhone smartphone of GUI OSes: very capable, but horribly complex and fiddly.

Actually, arguably — and demonstrably, from the Atari ST market — DR GEM was a far better GUI than Windows 1 or 2. GEM was a rip-off of the Mac; the PC version got sued and crippled as a result, so blatant was it. It took MS over a decade to learn from the Mac (and GEM) and produce the first version of Windows with a GUI good enough to rival the Mac’s, while being different enough not to get sued: Windows 95.

Now, 2 decades later, everyone’s GUI borrows from Win95. Linux is still struggling to move on from Win95-like desktops, and even Mac OS X, based on a product which inspired Win95, borrows some elements from the Win95 GUI.

Everyone copies MS, and MS copies Apple. Apple takes bleeding-edge tech and turns geek toys into products that the masses actually want to buy.

Microsoft’s success is founded on the IBM PC, and that was IBM’s response to the Apple ][.

Apple has been doing this consistently for about 40 years. It often takes it 2 or 3 goes, but it gets there.

  • First time: 8-bit home micros (the Apple ][, an improved version of a DIY kit.)

  • Second time: GUIs (first the Lisa, then the Mac).

  • Third time: USB (on the iMac, arguably the first general-purpose PC designed and sold for Internet access as its primary function).

  • Fourth time: digital music players (the iPod wasn’t even the first with a hard disk).

  • Fifth time: desktop Unix (OS X, based on NeXTstep).

  • Sixth time: smartphones (based on what became the iPad, remember).

  • Seventh time: tablets (the iPad, actually progenitor of the iPhone rather than the other way round).

Yes, there are too many Mac fans, and they’re often under-informed. But there are also far too many Microsoft apologists, and too many Linux ones, too.

I use an Apple desktop, partly because with a desktop, I can choose my own keyboard and pointing device. I hate modern Apple ones.

I don’t use Apple laptops or phones. I’ve owned multiple examples of both. I prefer the rivals.

My whole career has been largely propelled by Microsoft products. I still use some, although my laptops run Linux, which I much prefer.

I am not a fanboy of any of them, but sadly, anyone who expresses fondness or admiration for anything Apple will be inevitably branded as one by the Anti-Apple fanboys, whose ardent advocacy is just as strong and just as irrational.

As will this.

Mon, Mar. 21st, 2016, 02:57 pm
Confessions of a Sinclair fan

I'm very fond of Spectrums (Spectra?) because they're the first computer I owned. I'd used my uncle's ZX-81, and one belonging to a neighbour, and Commodore PETs at school, but the PET was vastly too expensive and the ZX-81 too limited to be of great interest to me.

I read an article once that praised Apple for bringing home computers to the masses with the Apple ][, the first home computer for under US$ 1000. A thousand bucks? That was fantasy winning-the-football-pools money!

No, for me, the hero of the home computer revolution was Sir Clive Sinclair, for bringing us the first home computer for under GB £100. A hundred quid was achievable. A thousand would have gone on a newer car or a family holiday.

Mon, Feb. 29th, 2016, 11:36 pm
Floppies and hard disks and ROMs, oh my! Or why early micros couldn't boot from HD

In lieu of real content, a repurposed FB comment, 'cos I thought it stood alone fairly well. I'm meant to be writing about containers and the FB comment was a displacement activity.



The first single-user computers started to appear in the mid-1970s, such as the MITS Altair. These had no storage at all in their most minimal form -- you entered code into their few hundred bytes of memory (not MB, not kB, just 128 bytes or so).

One of the radical things about them was that they had a microprocessor: the CPU was a single chip. Before that, processors were constructed from lots of components, as in e.g. the KENBAK-1.

A single-user desktop computer with a microprocessor was called a microcomputer.

So, in the mid- to late-1970s, hard disks were *extremely* expensive -- thousands of $/£, more than the computer itself. So nobody fitted them to microcomputers.

Even floppy drives were quite expensive. They'd double the price of the computer. So the first mass-produced "micros" saved to audio tape cassette. No disk drive, no disk controller -- it was left out to save costs.

If the machine was modular enough, you could add a floppy disk controller later, and plug a floppy drive into that.

With only tape to load and save from, working at 1200 bits per second or so, even small programs of a few kB took minutes to load. So the core software was built into a permanent memory chip in the computer, called a ROM. The computer didn't boot: you turned it on, and it started running the code in the ROM. No loading stage necessary, but you couldn't update or change it without swapping chips. Still, it was really tiny, so bugs were not a huge problem.
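
As a rough worked example (my own figures, just to show the scale of the problem at the data rate mentioned above):

```python
# Back-of-the-envelope cassette load times at roughly 1200 bits per second.
# The program sizes are illustrative; real tape formats added leaders, framing
# bits and checksums, so actual loads were slower still.
for size_kb in (8, 16, 48):
    bits = size_kb * 1024 * 8       # 8 data bits per byte, ignoring overhead
    minutes = bits / 1200.0 / 60    # time at 1200 bits per second, in minutes
    print("%2d kB takes about %.1f minutes" % (size_kb, minutes))

# ->  8 kB takes about 0.9 minutes
# -> 16 kB takes about 1.8 minutes
# -> 48 kB takes about 5.5 minutes
```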

Later, by a few years into the 1980s, floppy drives fell in price so that high-end micros had them as a common accessory, although still not built in as standard for most.

But the core software was still on a ROM chip. They might have a facility to automatically run a program on a floppy, but you had to invoke a command to trigger it -- the computer couldn't tell when you inserted a diskette.

By the 16-bit era, the mid-1980s, 3.5" drives were cheap enough to bundle as standard. Now, the built-in software in the ROM just had to be complex enough to start the floppy drive and load the OS from there. Some machines still kept the whole OS in ROM though, such as the Atari ST and Acorn Archimedes. Others, like the Commodore Amiga, IBM PC & Apple Macintosh, loaded it from diskette.

Putting it on diskette was cheaper, and it meant you could update it easily, or even replace it with alternative OSes -- or, for games, do without an OS altogether and boot directly into the game.

But hard disks were still seriously expensive, and needed a separate hard disk controller to be fitted to the machine. Inexpensive home machines like the early or basic-model Amigas and STs didn't have one -- again, it was left out for cost-saving reasons.

On bigger machines with expansion slots, you could add a hard disk controller and it would have a ROM chip on it that added the ability to boot from a hard disk connected to the controller card. But if your machine was a closed box with no internal slots, it was often impossible to add such a controller, so you might get a machine which later in its life had a hard disk controller and drive added, but the ROMs couldn't be updated so it wasn't possible to boot from the hard disk.

But this was quite rare. The third model of Mac, the Mac Plus, added SCSI ports, the PC was always modular, and the higher-end models of STs, Amigas and Archimedes had hard disk interfaces.

The phase of machines with HDs but booting from floppy was fairly brief and they weren't common.

If the on-board ROMs could be updated, replaced, or just supplemented with extra ones in the HD controller, you could add the ability to boot from HD. If the machine booted from floppy anyway, this wasn't so hard.



Which reminds me -- I am still looking for an add-on hard disk for an Amstrad PCW, if anyone knows of such a thing!

Thu, Feb. 18th, 2016, 01:58 pm
Unix: the new legacy platform [tech blog post, by me]

Today, Linux is Unix. And Linux is a traditional, old-fashioned, native-binary, honking great monolithic lump of code in a primitive, unsafe, 1970s language.

The sad truth is this:

Unix is not going to evolve any more. It hasn't evolved much in 30 years. It's just being refined: the bugs are gradually getting caught, but no big changes have happened since the 1980s.

Dr Andy Tanenbaum was right in 1992. Linux is obsolete.

Many old projects had a version numbering scheme like, e.g., SunOS:

Release 1.0, r2, r3, r4...

Then a big rewrite: Version 2! Solaris! (AKA SunOS 5)

Then Solaris 2.1, 2.2 and so on, until the "2." was quietly dropped: Solaris 7, 8, 9... now we're on 11 and counting.

Windows reset after v3, with NT. Java did the reverse after 1.4: Java 1.5 was "Java 5". Looks more mature, right? Right?

Well, Unix dates from between 1970 and the rewrite in C in 1972. Motto: "Everything's a file."

Unix 2.0 happened way back in the 1980s and was released in 1991: Plan 9 from Bell Labs.

It was Unix, but with even more things turned into files. Integrated networking, distributed processes and more.

The world ignored it.

Plan 9 2.0 was Inferno: it went truly platform-neutral. C was replaced by Limbo, type-safe, compiling code down to binaries that ran on Dis, a universal VM. Sort of like Java, but better and reaching right down into the kernel.

The world ignored that, too.

Then came the idea of microkernels. They've been tried lots of times, but people seized on the idea of early versions that had problems -- Mach 1 and Mach 2 -- and failed projects such as the GNU HURD.

They ignore successful versions:
* Mach 3 as used in Mac OS X and iOS
* DEC OSF/1, later called DEC Tru64 Unix, also based on Mach
* QNX, a proprietary true-microkernel OS used widely around the world since the 1980s, now in Blackberry 10 but also in hundreds of millions of embedded devices.

All are proper solid commercial successes.

Now, there's Minix 3, a FOSS microkernel with the NetBSD userland on top.

But Linux is too established.

Yes, NextBSD is a very interesting project. But basically, it's just fitting Apple userland services onto FreeBSD.

So, yes, interesting, but FreeBSD is a sideline. Linux is the real focus of attention. FreeBSD jails are over a decade old, but look at the fuss the world is making about Docker.

There is now too much legacy around Unix -- and especially Linux -- for any other Unix to get much traction.

We've had Unix 2.0, then Unix 2.1, then a different, less radical, more conservative kind of Unix 2.0 in the form of microkernels. Simpler, cleaner, more modular, more reliable.

And everyone ignored it.

So we're stuck with the old one, and it won't go away until something totally different comes along to replace it altogether.

Fri, Feb. 5th, 2016, 06:51 pm
Why do Macs have "logic boards" while PCs have "motherboards"?

Since it looks like my FB comment is about to get censored, I thought I'd repost it...

-----

Gods, you are such a bunch of newbies! Only one comment out of 20 knows the actual answer.

History lesson. Sit down and shaddup, ya dumb punks.

Early microcomputers did not have a single PCB with all the components on it. The components were on separate cards, all connected together via a bus. The board carrying the bus was called a backplane, and there were 2 types: active and passive. It didn't do anything except interconnect the other components.

Then, with increasing integration, a main board with the main controller logic on it became common, but this had slots on it for other components that were too expensive to include. The pioneer was the Apple II, known affectionately as the Apple ][. The main board had the processor, RAM and glue logic. Cards provided facilities such as printer ports, an 80 column display, a disk controller and so on.

But unlike the older S100 bus and similar machines, these boards did nothing without the main board. So they were called daughter boards, and the one they plugged into was the motherboard.

Then came the Mac. This had no slots so there could be no daughterboards. Nothing plugged into it, not even RAM -- it accepted no expansions at all; therefore it made no sense to call it a motherboard.

It was not the only PCB in the computer, though. The original Mac, remember, had a 9" mono CRT built in. An analogue display, it needed analogue electronics to control it. These were on the Analog Board (because Americans can't spell.)

The board with the digital electronics on it -- the bits that did the computing, in other words the logic -- was the Logic Board.

2 main boards, not one. But neither was primary, neither had other subboards. So, logic board and analog board.

And it's stuck. There are no expansion slots on any modern Mac. They're all logic boards, *not* motherboards because they have no children.

https://www.ifixit.com/Teardown/Macintosh+128K+Teardown/21422

Sat, Jan. 30th, 2016, 07:37 pm
Fallen giants - comparing the '80s second-generation home computers

A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off.

But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.

The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.

Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but then, Mac prices fell, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, the same was true of the Amiga chipset, but less so of the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)

The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)

It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.


I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.

The QL could have been one of the important 2nd-generation home computers. It was launched a couple of weeks before the Mac.

But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) had as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.

But Sinclair's OS, QDOS, was impressive. An excellent BASIC, front & centre like an 8-bit machine, but also full multitasking, and modularity so it readily handled new peripherals -- but no GUI by default.

The Mac, similarly RAM-deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still looks primitive.

Atari, entering the game a year or so later, had a much better idea where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough. Marginally faster CPU than an Amiga, and a floppy format interchangeable with PCs.
Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.

The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.

The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.

The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.

The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga, its OS wasn't as elegant as the Mac, but it was good enough.

The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip which the OS didn't really utilise, and it had provision for a built-in hard disk -- which was an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.

The ST killed the Amiga, in effect. By providing an experience that was nearly as good in the important, visible ways, it forced Commodore to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between them both without too much work, they mostly didn't exploit the Amiga's superior abilities.

Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.

But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.

The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/e and moved over to the ST platform instead. It's now open source, too.

And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.

The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.

The ST OS also evolved: into a rich multitasking app environment for PCs and Macs (MagiC), and into a rich multitasking FOSS version, AFROS, running on an emulator on the PC, Aranym. A great and very clever little project, but one which went nowhere, as did PC GEM, sadly.

All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!

In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's Helios, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.

And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my Blackberry Passport smartphone. Blackberry 10 is now all but dead -- Blackberry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.

But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.

So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was there on the wall for all to see.

But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.

Funny how things turn out.
