
Tue, Oct. 8th, 2019, 01:26 pm
What was the importance of Windows 2000?

(Another recycled Quora answer)

It was a pivotal release of the NT family of OSes.

It is forgotten now but Microsoft has had multiple tries at creating operating systems. In the very early 1980s, it had its own UNIX, called Xenix, which it offered for multiple computer platforms including the Apple Lisa.

Xenix failed.

Then there was MS-DOS 4, an attempt to create a multitasking DOS. This failed, and was replaced with IBM’s PC DOS 4, which was a very simple enhancement of MS-DOS 3.3, supporting larger hard disk partitions and adding a simple graphical program launcher called DOSShell.

Then there was OS/2, co-developed with IBM, designed from the ground up to be a multitasking, networked OS. Unfortunately, IBM crippled it by insisting that the new OS run on 80286 computers, because IBM had sold thousands of 80286-based PS/2 computers and promised its customers that they would one day be able to run OS/2.

The customers didn’t care — they ran PC DOS on the machines and were happy. OS/2 should have targeted the new 80386 processor, which would have made it able to multitask DOS applications. But it didn’t, so it failed.

A desperate Microsoft adopted an unofficial back-room skunkworks project to improve the commercial failure that was Windows 2. This was called Windows 3 and it was a huge success, but it ran on top of the very limited MS-DOS. It was a technical triumph that Windows 3 worked as well as it did.


Wed, Oct. 2nd, 2019, 10:31 am
Is Firefox the same program as Netscape? Specifically, Netscape Navigator?

Note, there were 2 products:
• Netscape Communicator: full suite
• Netscape Navigator: just a browser

Mozilla was always the codename for the product while it was in development.

Netscape started out as just a browser. Then it gained email -- see Zawinski's Law: https://en.wikiquote.org/wiki/Jamie_Zawinski

Then it gained web editing. Then it gained calendaring when Netscape Corp bought Collabra.

Netscape was driven to bankruptcy by Microsoft, which gave away IE for free in order to, quote, "knife Netscape in the back" (S Ballmer).

Sun bought the server software. AOL bought the client software, and the dying Netscape Corp made future versions open source.

Note, the then-current version (Netscape 4.x) was not made FOSS, and never has been. Only the unfinished future Netscape 5.x version was.

AOL owned the name "Netscape" so the new FOSS project couldn't be called that. So it went back to its old codename: Mozilla, the Godzilla of Mosaics. (Mosaic was the original GUI web browser.)

It was not finished and most of the employees had been laid off, while the new owners, AOL, actually used and bundled IE as the browser with their client software. (If they did not, Microsoft said it would not bundle AOL with Windows 95 & 98. More illegal restraint of trade; never prosecuted.)

It took a long time to finish Mozilla 5.

Occasionally, as it got usable, AOL took a snapshot, packaged it as a branded free product, and badged it Netscape.

Netscape 6 was Mozilla 0.6 and so on:
https://en.wikipedia.org/wiki/Netscape_6

This was Communicator, not Navigator: the whole suite, with email, address book, web editor, etc.

Navigator was never open-sourced.

The Mozilla Application Suite became the default web browser on most Linux distros, but never became a hit on Windows or Mac. Part of the reason was that most OSes came with email & chat clients anyway, and few turn-of-the-century web users wanted or needed the web page editor.

A few years later, wanting to regain some of that old success, a team within Mozilla produced a new, cut-down browser-only program. It was called Mozilla Phoenix: the program that rose from the ashes of Netscape.

Snag is, there are other software products called Phoenix. Someone sued.

So it was renamed Mozilla Firebird. The phoenix is the fire-bird.

But there's another FOSS program called Firebird (a database).


Wed, Sep. 11th, 2019, 12:49 pm
A brief history of "Office Suites" versus integrated apps

MS Office was something new when it was launched.

Before MS Office, the market-leading apps in each sector tended to be from different vendors. E.g. in the late MS-DOS era, the leading apps were:

  • Spreadsheet: Lotus 1–2–3
  • Word processor: WordPerfect
  • Database: Ashton-Tate dBase IV
  • Presentation program: Harvard Graphics

The shift to Windows allowed MS to get the upper hand and field competitive apps in all these sectors. To understand this, you must know that the plan was for MS-DOS to be replaced by OS/2, co-written by Microsoft with IBM. Big vendors such as Lotus and WordPerfect put a lot of effort into new OS/2 versions.

But OS/2 flopped, because at IBM’s insistence, OS/2 1.x ran on the 16-bit 80286 CPU. The 286 could not effectively multitask DOS apps, and as a result, neither could OS/2 1.x — nor could it even offer very good DOS compatibility. That needed the 32-bit 80386 CPU — the chip whose architecture became the 32-bit “i386” baseline.

When OS/2 1.x flopped, Microsoft made a last-ditch effort to revive its failed Windows product on top of DOS. Windows 3 was a surprise hit. MS was not expecting it — when Windows 3 came out, the only major app Microsoft offered for its own GUI was Excel, which had been ported from the Mac to Windows 2. 

But MS pivoted quickly, hastily wrote a wordprocessor for Windows — Word for Windows 1 — and ported its Mac presentation program PowerPoint to Windows.


Fri, Aug. 9th, 2019, 02:11 pm
If Linux is so big then how come the desktop isn't developing?

Linux is big business now. Mostly on servers.

The only significant user-facing Linux systems are Android and ChromeOS, both dramatically constrained systems, which is part of why they've been successful. It is taking desktop distro vendors way too long to catch up with what those OSes are doing, but distros like Endless, and to a lesser extent Fedora Silverblue, are showing the way:

  • all apps containerised; no inter-app dependencies at all

  • OS image shipped as a complete, tested image

  • Most of the filesystem is read-only

  • no package manager, no end-user ability to install/remove/update packages. You get a whole new OS image periodically, like on a phone.

  • OS updates are transactional: it deploys the whole thing, and if it doesn't work, it rolls back the entire OS to the last known good snapshot. 2+ OS snapshots are maintained at any time so there should always be a good one.
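
A minimal sketch of that transactional A/B scheme in Python -- every name here is invented for illustration, not any distro's real update API:

    class Snapshot:
        """One complete, tested OS image."""
        def __init__(self, version, healthy=True):
            self.version = version
            self.healthy = healthy          # does this image boot OK?

    class SystemUpdater:
        def __init__(self, current):
            self.snapshots = [current]      # known-good images, oldest first

        def update(self, new_image, boots_ok):
            """Deploy a whole new image; roll back if it fails to boot."""
            self.snapshots.append(new_image)
            if boots_ok(new_image):
                self.snapshots = self.snapshots[-2:]   # keep 2+ snapshots
                return new_image
            self.snapshots.pop()            # transactional rollback:
            return self.snapshots[-1]       # back to last known good

    updater = SystemUpdater(Snapshot("os-1.0"))
    print(updater.update(Snapshot("os-1.1"), lambda s: s.healthy).version)  # os-1.1
    broken = Snapshot("os-1.2", healthy=False)
    print(updater.update(broken, lambda s: s.healthy).version)              # os-1.1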


This is a good thing. Software bloat is vast now. OSes are too complex for most people to understand, maintain, or fix. So you don't. Even the ability is removed.

This is in parallel with server deployments:

  • everything is virtualised: OSes only run in VMs with standardised hardware, the network connections are virtualised, the disks are virtualised.

  • VMs are built from templates and deployed automatically as needed, and destroyed again as needed.

  • there is as little local state as possible in any VM. It gets its state info automatically from a database over the network. The database is in a VM too, of course.

  • as few local config files as possible; config is kept in a database too and pushed out to local database instances


I could go on.

Unix is a late-1960s OS designed for late-1960s minicomputers:

  • big standalone non-networked servers with lots of small disks, shared by multiple interactive users on dumb text terminals

  • users built their own software from source

  • everything is a text file. Editors and piping are key tools.


With some 1970s tech on top that the industry spent 25 years getting working stably:

  • framebuffers and hi-res graphic displays are possible but very expensive

  • so, design for graphical terminals, or micros that are dedicated display servers

  • programs run over the network, executing on 1 machine, displaying on another

  • Ethernet networking has been bolted on. TCP/IP is the main protocol.

  • because GUIs and networking are add-ons, they break the "everything is a file" model. This is ignored. Editors etc. do not allow for it, let alone use it.

  • machines treat one another as hostile. There is no federation, no process migration, etc.


Then in the 1980s this moribund minicomputer OS got a 2nd lease of life and started selling well because microcomputers got powerful enough to run it, growing up into expensive high-power workstations:

  • some effort at network integration: tools were bolted on top for distributing text-only config files automatically, machines could query each other to find resources

  • encryption was added for moving stuff over untrusted networks

  • a lot of focus on powerful programming tools and things like maths tools, 3D modelling tools

  • very little focus on user-friendliness or ease of use, as that sector was dominated by Macs, Amigas etc.

  • much of this stuff is proprietary because of the nature of the business model.

  • server support is half-hearted as there are dedicated server OSes for that


In the 1990s things changed again:

  • plain cheap PCs became powerful enough to run Unix usefully

  • the existing vendors flailed around trying to sell it but mostly failed as they kept their very expensive pricing models from the workstation era

  • FOSS re-implementations replace it, piggybacking on tech developed for Windows

  • After about 1½ decades of work, the leading FOSS *nix becomes a usable desktop OS. Linux wins. FreeBSD trails, but has some good work -- much of this goes into Mac OS X


Early 21st century:

  • high-speed Internet access can be assumed

  • non-technical end-users become a primary "market"

  • now it runs on local 64-bit multi-CPU micros with essentially infinite disk

  • it has a local 3D accelerator for a display


Results...

  • traditional troubleshooting/fault finding is obsolete. No need for keeping admin tools separate from user tools, no need for /bin and /sbin, /usr/bin and /usr/sbin, etc. Boot off a DVD or a USB, recover user data if any, nuke the OS and reload.

  • GUIs favour 3D chrome. When harmony is achieved & everyone standardises on GNOME 2, Microsoft attacks it and destroys it, resulting in vast duplication of desktop functionality and a huge amount of wasted effort.

  • Because of poor app portability between distros, just like in the days of proprietary Unix, only a few big-name apps exist for all distros.

  • Linux is mainly usable only for Web/email/chat/simple office stuff, and traditional coder work. Windows and Mac hoover up all of the rich-local-apps market, including games. Linux vendors do not even notice.

  • Linux on conventional desktops/laptops is weak, but that market is shrinking fast. But...

  • Not-really-Linux-any-more phone/tablet OSes are thriving

  • Consumer Internet use is huge, for content consumption, social networking, and retail


This drives a need for vast server farms, with the lowest possible unit software cost.

  • tools for automation -- for deployment, management, scaling -- are big money

  • because the job market is huge, skill levels are relatively low, so automated distribution of workloads is key:

  • - tools for deploying & re-deploying VM images automatically in case of failure of the contained app

  • - tools for large teams to interwork on incremental, iterative software development

  • - bolting together existing components, automated building and testing and packaging and deployment

  • as the only significant successful end-user apps are web browsers, all tools move onto the web platform:

  • - web mail, web chat,  web media, web file storage, web config management

  • Result: tooling written in Web tools -- JavaScript -- displaying over Web UIs (browser rendering engines)

  • On the server end, inefficiency can be solved by deploying more servers. They're cheap, the software is free.

  • On the client end, most focus is on fast browsers and using games acceleration hardware to deliver fast web browsing, media playback, and hardware accelerated UI


So the only possible method of fighting back and trying to deliver improved end-user tooling for power users is to use a mixture of web tools and games hardware.

Result: OSes that need 3D OpenGL compositing, with desktops and apps written in JavaScript, and packaging and deployment methods taken from those designed for huge server farms.

  • GNOME 3 and Cinnamon, and a distant 3rd, KDE. (The only others are principally defined by refusal to conform.)

  • Flatpak, Snappy and a distant 3rd, AppImage

  • systemd and an increasing move away from text files, including for config and logging -- server farm tools use database connections, because in the 1980s & 1990s, nobody saw any reason to try to copy Microsoft's LAN Manager domains, Novell NDS, Banyan VINES' StreetTalk, or any other more sophisticated LAN management tools.


Gosh. That turned into quite a rant.

Anyway. The Linux desktop is going to continue to move away from familiar *nix ways because they are historical now. Because the Linux desktop is only a tiny parasite on the flank of the vast Linux server market, it gets tooling designed for that.

If you want a more traditional Unix experience, try FreeBSD. It's thriving on the backlash against the move to systemd and so on.

Tue, Aug. 6th, 2019, 02:23 pm
Combating some myths about Windows' origins & shipping media

Another Quora answer. Someone is wrong on the Internet!

Your history and your memories are both incorrect.

MS Windows 1 was released in 1985: Windows 1.0 - Wikipedia

It did not resemble GEM. MS worked closely with Apple and had designed Windows as a tiling window interface, with no desktop, no drive icons and no other features to resemble MacOS, which had been released the year before.

If you look at it you will see next to no resemblance: GUIdebook > Screenshots > Windows 1.01

Furthermore, GEM is not an Atari product. GEM was written by Digital Research and released on the PC before it was ported to the ST: Graphics Environment Manager - Wikipedia

Additionally the Atari ST was not only a games computer; perhaps its primary long-term market success was as a music sequencer, due to built-in MIDI ports. STs were still used for this well into this century. Here are some accounts: Red Bull Music Academy Daily

The band Atari Teenage Riot were named after the machine for this reason. The musician Alec Empire still uses one. I have seen both, and I still own an ST. Have you? Do you?

GEM did closely resemble MacOS; Apple sued and won, and PC GEM was crippled so that it did not look so Mac-like. Compare here:

GEM 2.0

No overlapping windows — tiled instead. No desktop drive icons.

The lawsuit did not affect the Atari version.

Atari TOS 1.0

GEM is now FOSS and the Mac-like features have been restored: Screenshots of FreeGEM

It does not “look like X-windows”. There are 2 primary reasons.


  1. There is no such thing as “X-Windows”. It is The X Window System, so called because it followed the W Window System.
    W Window System - Wikipedia
    It was called W because it ran on top of, i.e. came after, V:
    V (operating system) - Wikipedia
    There is not and never has been a product called “X-Windows”. The current version of X is version 11, so it is usually called X.11. The reference implementation is maintained by the X.Org Foundation, whose website is X.Org, so it is often called X.org.
    Decades ago they spent a lot of money on trying to teach people not to call it “X-Windows”. That was never the name.

  2. X imposes no look and feel. It is just a system for drawing windows on the screen and putting contents in them. Every X.11 environment looks different. Look at the early version with twm in the Wikipedia article and you will see it’s nothing like MS Windows. Or compare to SunOS:
    SunView - SunOS 3.5
    The later Motif toolkit looks a little like Windows, with similar controls, because it was licensed from Microsoft, so that it would be familiar to use.
    GUIdebook > Screenshots > CDE 1.5 in Solaris 9

I deployed Windows for Workgroups in production in 1992. It did come on floppies.

The next year, I replaced some of the nodes on the networks with early Pentium computers running Windows NT 3.1. It was shipped on CD. You can download CD images here if you wish: Windows NT 3.x 3.1


Again, I have one. And 95, 95B, 98, 98SE, ME, NT 3.51, NT 4, and Windows 2000. Do you?

There were editions available, at extra cost, on floppies, yes, but as even NT 3.1 in 1993 took over 30 floppies, it was not a popular option.

NT 3.51 Workstation was 150 MB. You can look at the downloads for yourself here:

Windows NT 3.x 3.51

Since a high-density 3½” floppy diskette stores 1.4 MB, that means about 100 floppy disks. Nobody used this if they had a choice. You remember incorrectly if you think it came on 11 disks; it took 3 just to boot a text-mode installer!

Windows 95 shipped on CD by default.

Windows 95B, which added USB support, also came on CD.

Again, yes, floppies were available, or you could make your own, but it took a lot and was very cumbersome indeed.

Even if you bought a PC with Windows pre-installed, you got the CD. You did not normally get floppies because there were so many of them it was too expensive to duplicate and ship them all.

Note that both NT 3 and Windows 9x came with boot floppies, because add-on CD-ROM drives on PCs were not usually bootable at this time.

So you booted the PC off floppies, loaded the CD-ROM device drivers into MS-DOS (for Win9x), and then accessed the CD and ran SETUP. This may be what you are thinking of.

NT had 3 boot floppies, to load the kernel, then some essential drivers, then the Setup program.

Win9x had just one and indeed the OS contained an image of a bootable floppy and could write it to disk for you. You can download that here:

Bootdisk.Com

You seem to be working from some very vague and patchy memories. Perhaps you were very young at the time.

I was not. I was a year into my first job in IT when Windows 3.0 was released. I correctly predicted that it would be a huge hit. The company did not believe me and refused to stock up.

Suffice to say that within a few years the company no longer existed.

I worked with this stuff as an adult professional. It was my stock-in-trade. I kept copies of stand-out highlight products.

I know whereof I speak.


Tue, Aug. 6th, 2019, 01:44 pm
How and why MS WinWord defeated WordPerfect

(Another recycled Quora answer.)

Multiple reasons. In no particular order:


  • Cross-platform support.
    WordPerfect was a highly-optimised, cross-platform text-mode app. It ran on everything -- Mac, DOS, Xenix, Atari ST, Amiga, VAX/VMS, Data General: all the mid-to-late-1980s platforms.
    As a company, WordPerfect Corp missed the fact that Windows would soon be the dominant platform, and did not give it enough priority.
    Compare with Lotus, which devoted its effort to 1–2–3 for OS/2 and missed the market shift to Windows 3.
    The result was a poor Windows version of WordPerfect: slow, buggy, with a poor UI. It was eventually fixed, but too late.

  • Printer Drivers.
    Pre-GUI OSes did not have a single central driver mechanism or printing subsystem. Every app had to provide its own. WP had the biggest and best. It could drive every printer on the market, natively, and get the best from it.
    Additionally, graphical OSes managed fonts, and screen fonts became printer fonts too.
    On Windows and Mac this was irrelevant. The OS drove the printer, not the app, and text was rendered and printed in graphics mode. WP’s vast driver database and sophisticated font support became completely irrelevant and indeed a maintenance problem for the company.

  • User interface.
    As a very cross-platform app, WP largely ignored the underlying OS’s UI and imposed its own, weird, tricky but very powerful UI. All leading DOS apps did this: it was a mark of pride to memorise multiple ones.
    Windows and MacOS swept this away with a new, standardised UI and editing model, at odds with WP’s.
    See: CUA — IBM Common User Access - Wikipedia
    WP tried to maintain both, side-by-side. This sort of worked but the emphasis on the old system alienated GUI users.

  • Cost.
    WP was an expensive, standalone app. It became its maker’s sole product: the DataPerfect database, WordPerfect Editor plain-text editor, LetterPerfect cut-down word processor, WordPerfect Library menuing system & DOS utilities, all fell by the wayside. Satellite Software even renamed itself to WordPerfect Corporation.
    Word for Windows was good enough for most people, but the cheap way to buy WinWord was as part of the MS Office bundle.
    WordPerfect Corp had no such bundle. It only did wordprocessors. MS Office was far cheaper than buying a market-leading word processor (e.g. WordPerfect) plus a market-leading spreadsheet (e.g. Lotus 1–2–3) plus a market-leading database (e.g. dBase IV), etc.
    In the end, Novell bought WordPerfect, bundled it with other purchases, such as Borland’s QuattroPro spreadsheet and Paradox database. It was not enough and the apps did not integrate any better than any other random Windows apps. So Novell sold the suite off to Corel, which has made a modest success selling the bundle.
    Corel did a deal with Microsoft to integrate MS Visual BASIC for Applications as the suite’s macro language, and adopt the MS Office look and feel — not realising that MS changed the look and feel of Office with every new version, to keep it looking fresh. A term of this deal was killing the native Linux WordPerfect (a superb app and probably the best Linux word-processor ever written), and the forthcoming port of the entire WordPerfect Office suite to Linux.
    This was the end of cross-platform WordPerfect, the Mac version already being dead — a superb classic MacOS app, it was never updated for Mac OS X.

Thu, Aug. 1st, 2019, 06:09 pm
A Tale of Two (and a half) PC Emulators.

So, once upon a time, there was a software PC emulator for the Mac. That's old PowerMacs running classic MacOS.

It was called SoftPC, by a British company called Insignia. SoftPC was a PC emulator for non-x86 computers: Unix workstations with RISC chips, basically. Some of the early RISC workstations were so much faster than PCs that you could run a usable emulation of a DOS PC and so run a few DOS apps.

It grew up to be a package called SoftWindows -- you can download it for free these days.

A bit later, the Acorn Archimedes came out -- a home computer fast enough to do the same thing. Acorn wrote their own, called, appropriately, "PC Emulator". Here's the manual [PDF], a compatibility list, and a contemporary write-up. (The latter is mainly about a follow-on product, but my original Archie was too low-spec to run that.)

I used it to take work home with me from my first ever job. The emulation gave me a slow PC but with very fast graphics and disk. It was certainly usable.

Later Insignia ported SoftPC to the Mac when PowerMacs became as powerful as the early UNIX machines (but 10× cheaper.)  SoftWindows was SoftPC enhanced with emulated (native Mac binary) device drivers to make Windows (and only Windows) run quicker. But since Windows is mainly what people needed, it did OK.

Fun fact: RISC versions of Windows NT (for MIPS, Alpha and PowerPC) ran 16-bit DOS apps and Win16 binaries via a licensed, embedded version of the Insignia SoftPC technology.

SoftWindows did so well that pioneering Mac vendor Connectix wrote their own version, Virtual PC. They'd already done other emulators so a PC one didn't seem so hard.

SoftWindows and Virtual PC were the two main rival products for Mac users who wanted occasional access to PC programs.

When VMware released their eponymous product, Connectix paid close attention.

VMware worked by trapping Ring 0 code (kernel code, stuff that directly manipulated the hardware) and running it through a CPU emulator -- on the native PC. This enabled x86 PCs to run virtualised x86 PCs. Before then, this needed special hardware (dedicated CPU instructions for virtualisation) that SPARC and POWER had but the x86 didn't. Indeed, the pundits had said it was impossible on x86.
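
A toy sketch of that trap-the-privileged-stuff idea, in Python -- this is only the concept, not VMware's actual mechanism, and every name here is invented: ordinary instructions run natively at full speed, while privileged Ring 0 operations are intercepted and applied to a virtual CPU instead of the real one.

    PRIVILEGED = {"cli", "sti", "out"}      # a few sample privileged ops

    def run_guest(instructions, vcpu):
        for op in instructions:
            if op in PRIVILEGED:
                emulate(op, vcpu)           # the monitor handles it safely
            else:
                execute_natively(op, vcpu)  # runs directly on the CPU

    def emulate(op, vcpu):
        # Update the *virtual* machine state; real hardware is untouched.
        if op in ("cli", "sti"):
            vcpu["interrupts_enabled"] = (op == "sti")

    def execute_natively(op, vcpu):
        vcpu.setdefault("executed", []).append(op)   # stand-in for real execution

    vcpu = {}
    run_guest(["mov", "add", "cli", "mov", "sti"], vcpu)
    print(vcpu)  # {'executed': ['mov', 'add', 'mov'], 'interrupts_enabled': True}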

Connectix thought "huh, we have a PC emulator already. We can do that." So they ported VirtualPC to the real PC. It was cheaper and easier to use than VMware.

Source: me. I interviewed the founder of Connectix, Jon Garber. He flew to the UK to meet me personally. Fun times.

As virtualisation took off, Intel added hardware virtualisation instructions to its chips. AMD did the same.

So the software emulators weren't needed any more -- it was much simpler to write one using the hardware facilities. That's exactly what KVM on Linux is.

But you need something to create the VM, manage virtual disks etc.

KVM uses the existing QEMU emulator for this.

Microsoft decided it wanted a hypervisor, so it bought Connectix and used those bits of VirtualPC. The rest was made a free download -- it's what runs XP Mode for Windows 7.

Microsoft Hyper-V is VirtualPC, integrated into Windows and minus the emulation engine that's no longer needed.

So, at different times and in different versions of the same product, Microsoft licensed and incorporated both SoftPC and VirtualPC.

Wed, Jul. 31st, 2019, 06:35 pm
Retro geek level: out of this world. [Tech blog post]

For a couple of weeks, since the 50th anniversary of Apollo 11 taking off, I've been riveted by "CuriousMarc" Verdiell's YouTube channel. This isn't the first time -- his vlog of restoring a Xerox Alto was fascinating. But this project is even more historically significant: to get an original Apollo Guidance Computer running for the first time in about 45 years.

The AGC was all kinds of "first": the first computer made from integrated circuits; the first portable computer; the first computer to fly; the first computer on which humans landed on the moon.

Nonetheless I'm surprised to see the vlog even made The Wall Street Journal.

If you don't know anything about the AGC, here's a fantastic, very dense 1-hour talk about how it works.



Here's the YouTube playlist of the whole restoration process.

And here's a link to the story of Margaret Hamilton, the team lead on the project of programming the AGC. You might recognise the rather famous photo of her standing next to a printout of the software, which is slightly taller than she is.

A fun detail of the software development process: not only was the machine extremely resource-constrained, and human lives depended on it -- so, no pressure then (!) -- but you must also consider the storage medium: core rope.

Core rope memory is not the same as core store. Core store uses tiny ferrite rings arranged on the intersections of very fine wires. By putting a current through both wires, the magnetic alignment of the core at their crossing-point could be read. But read destructively -- the act of reading it erased it. Conversely, if the computer was off, the cores held their data indefinitely. People restoring 50- and 60-year-old computers today can read what was in their core store the last time they were turned off!

But core rope is different. It still uses cores, but big ones. Long wires thread in and out of cores, and the position of the wires encodes the data. So it's non-volatile: it's a kind of early ROM. You can never change the data. Ever. What was woven in when it was made is there forever. The phenomenally labour-intensive act of making it encodes the software: so weaving it was an extremely skilled task, given to experts... factories full of little old ladies, basically. This is software that is hand-knitted into the hardware. So after it's made, you can't change a single bit. The entire, multi-thousand-component hand-made rope must be re-woven.
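
A toy model of the principle in Python -- not the AGC's real geometry (a real rope core held many words), just the core idea that the data is the weaving: a sense wire threaded through a core reads as 1, a wire woven around it reads as 0.

    def weave(words, bits=16):
        # Fix the data at manufacture time: one core per word here,
        # one sense wire per bit position. True = wire threads through.
        return [[(word >> b) & 1 == 1 for b in range(bits)] for word in words]

    def read(rope, address, bits=16):
        # "Pulse" one core; every wire threaded through it picks up a 1.
        return sum(1 << b for b in range(bits) if rope[address][b])

    rope = weave([0o30001, 0o04017, 0o77777])   # woven once, immutable
    print(oct(read(rope, 1)))                   # 0o4017
    # Changing even one bit means re-weaving the entire rope.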

This is CuriousMarc's playlist of the Xerox Alto restoration. The Alto was also a hugely significant computer: the first GUI personal workstation, the machine on which the modern GUI was invented, the machine on which the pioneering object-oriented Smalltalk language was developed, and the first machine with Ethernet, which more or less invented the idea of the Local Area Network. Some of the original team came to admire the restoration process and help out -- and several of them are now dead.

The Alto is the machine that Steve Jobs & his team saw that led them to build the Lisa and then the Mac. They saw 3 things -- object-oriented programming, local-area networking & the GUI. Jobs himself said he fixated on the GUI and missed the (arguably, long-term) more important bits.

Source: the man himself.


This really is the last possible time to restore some of this stuff — while at least some of the creators are still alive.

Tue, Jul. 30th, 2019, 02:20 pm
What were the problems of using early computers? [Tech blog post]

Another Quora answer.

I can’t say. My family was not rich enough to afford such high-end computers that cost £thousands. Only Americans could.

In early-1980s Britain we had Sinclair, Commodore and Oric computers (e.g. the ZX Spectrum or C64.) The better-off had Acorn machines. (There were many other more obscure brands.)

Common problems?

Well, mass storage was too expensive for children & home users. No floppy disks. Programs were stored on cassette tapes and loaded at 1200 baud or less. Loading a game could take 5 or 10 minutes.
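
A quick sanity check on those load times, assuming roughly ten tape bits per byte of program to cover framing overhead (my rough figure, but in the right ballpark):

    baud = 1200                        # bits per second, at best
    bytes_per_second = baud / 10       # ~10 tape bits per stored byte
    game_bytes = 48 * 1024             # a full 48 kB game
    minutes = game_bytes / bytes_per_second / 60
    print(f"{minutes:.1f} minutes")    # ~6.8 minutes -- and slower tapes
                                       # or bigger loaders meant 10+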

It was common for computer magazines to print listings for you to type in yourself. This is how I learned programming. A big program could take days to type in, so an ever-present danger was the computer overheating and crashing, or someone accidentally unplugging it, and you losing all that work.

You saved to tape periodically. This could take 5–10 min again. The computers used ordinary audio cassette players. That means no automated control. No seek function. No directory listings. One program per side, and lots of hand-labelled tapes.

Audio tape is not a reliable medium. You could save hours of work and have it refuse to load the next day.

Even professionally-duplicated tapes suffered this, especially if you played the game a lot so the tape got worn. “Tape loading errors” were a common nightmare.

Some manufacturers offered optional disk controllers for more serious users, e.g. adults with more money. However, every make and model had its own disk format: a Commodore 64 could not read disks from a BBC Micro, and neither could read disks from a PC. Commodore disk drives used a serial interface and so were excruciatingly slow.

Sinclair aimed at the budget end of the market and invented its own medium, the Sinclair Microdrive: ZX Microdrive - Wikipedia

This was a form of stringy floppy: Exatron Stringy Floppy - Wikipedia

Also derived from an audio medium, as the mass market made the tech cheaper. In this case, 8-track cassettes: 8-track tape - Wikipedia

I had these before I saved up for a disk interface and a single 5¼” drive as a university student. Each microdrive cartridge stored under 100 kB. Access took tens of seconds, but was still an order of magnitude or more faster than cassettes, which took tens of minutes.

They were slow, small, unreliable, and failure-prone, but better than anything else for the price.

As these machines were very slow, and lacked enough storage to usefully run compilers, programmers worked in machine code to get enough performance for games. Magazines published these listings too. This might mean typing in 4, 5 or 6 pages of numbers.

So instead of typing in a BASIC listing, which was at least meaningful and could be followed, you had to type in page after page of opaque numbers.

One famous example: David Horne’s ZX-81 1K Chess, from Your Computer magazine, February 1983.

This is a notably short program: only 3 pages or so. It plays chess in 1000 bytes of total space, a famous achievement: 1K ZX Chess - Wikipedia

Try to imagine typing in 30–40 thousand characters of code, where a single mistaken character renders the entire thing useless. When buying a new game might cost £10 or £15, an amount of money that could take 6 months to save up, a week of evenings after school spent typing was worth doing.

This, note, on terrible keyboards that resembled a cheap pocket calculator:

No space bar. No cursor keys or delete key. Each key performing 5–6 different functions depending on which other keys were held down.

This is the machine I learned to code on; I spent years typing on this exact keyboard.

No hard disk. No floppy disks. No directly-accessible storage. Everything in RAM, so one second of power fluctuation and hours of work irretrievably lost.

This machine, with 48 kB of RAM, cost as much new as a cheap Chromebook does today. No monitor: you used a TV set, so the picture was fuzzy and unstable. The cassette player cost extra.

And you know what? We all absolutely loved it, and we miss it still today. :-)

Tue, Jun. 25th, 2019, 03:59 pm
The neck-and-neck race between DR-DOS and MS-DOS 3, 5, 6 and beyond.

The evolution of DOS is interesting, and few remember the bigger picture now.

MS struck a great deal when supplying DOS to IBM: it retained the rights to sell it to other manufacturers itself.

So in the early days, there were other MS-DOS machines that weren't IBM compatible, such as the Apricot, Victor and Sirius.

But soon it became apparent that IBM compatibility was key. Compaq reverse-engineered the IBM BIOS and built the first clones, and the PC industry started from there.

PC DOS only came with IBM kit. MS-DOS came with everything else, but only with the computer. You couldn't buy it directly.

Excluding bugfixes, it went like this:

DOS 1: floppy-only machines.
DOS 2: added hard disk support (a single one) and subdirectories.
DOS 3: added support for 2 hard disks and networking. Then in a point release, support for 2 partitions per disk. Then in another point release, multiple "logical drives" in a single extended partition, so you could use all the space on a big drive... but still a max of 32 MB.
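
That 32 MB ceiling falls straight out of the on-disk arithmetic: DOS 3 addressed a partition with a 16-bit sector count, and PC sectors are 512 bytes. A two-line check:

    sector_size = 512                                 # bytes per sector
    max_sectors = 2 ** 16                             # 16-bit sector numbers
    print(max_sectors * sector_size // 2**20, "MB")   # 32 MB per partition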

Other companies started tweaking their version of MS-DOS 3.3 to allow bigger than 32MB drives. The method used in Compaq DOS 3.31 is the one IBM and MS picked and it was used in DOS 4.

MS had a project to do a multitasking DOS 4 so didn't work on DOS 3.3 for ages. IBM did its own thing, and added big disk support, code page switching for international character sets, and a slightly clunky graphical launcher called DOSShell.

MS reluctantly released this as MS-DOS 4. It's the first release that required a bugfix fairly quickly. The multitasking version got abandoned: big disk support was needed more urgently. But DOS 4 had other gotchas -- such as using a lot more RAM so some apps couldn't run. (Everything in DOS had to fit into the first 640 kB).

DR noticed this. Its CP/M-86 was late and expensive, and so lost out to MS, even though it was the inspiration for SCP’s QDOS, the basis of DOS 1.0. DR had its own line of multitasking CP/M derivatives for minicomputer-like x86 machines with terminals: Concurrent CP/M, and later, with DOS app compatibility, Concurrent DOS. It also had its own standalone single-user DOS, DOS Plus, which could run 3 background tasks on a single PC (if they all fitted into what was left of 640 kB after the OS loaded!)

So DR reworked DOS Plus, removed anything that broke compatibility, like the multitasking and CP/M app support, updated its MS-DOS compatibility with code from Concurrent DOS, and released it as DR-DOS. It bumped the version number from the last small, memory-efficient MS-DOS, MS-DOS 3.3, but included compatible large-disk support. So… DR-DOS 3.41.

DR only offered it through OEMs at first. You couldn’t buy it at retail. But it proved moderately popular, a sort of cult hit. People heard about it. (This was all in the 1980s, so pre-WWW.) People asked to buy it as an upgrade.


So DR had a great idea. There were already 3rd party memory managers for DOS on 386 computers, which let you map RAM into bits of the space between 640 kB and 1024 kB. You couldn’t run bigger apps using this space because it wasn’t contiguous with base memory, but you could load bits of DOS into them: keyboard drivers, CD drivers, mouse drivers, disk caches. Now, instead of having only 500-550 kB of 640 free for your apps after loading all your drivers, you got more room: up to 580-590 kB.

PC/MS-DOS 4 made this even more necessary as it used more memory than DOS 3.3.

DR wrote their own and bundled it into DR-DOS, and leapfrogged MS-DOS 4 by calling it DR-DOS 5. You could even move DOS itself out of the base memory, and have 620-630 kB free, without 3rd party tools. It was amazing. It also added a full-screen text editor, which incredibly MS-DOS still didn’t have.
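
Some toy arithmetic to put numbers on that -- the resident sizes here are invented, only the totals echo the figures above:

    base = 640                   # kB of conventional memory
    dos = 55                     # resident DOS kernel + buffers (example)
    drivers = 45                 # keyboard, mouse, CD, cache... (example)

    print(base - dos - drivers)  # ~540 kB free: everything loaded low
    print(base - dos)            # ~585 kB free: drivers moved into UMBs
    print(base - 15)             # ~625 kB free: DR-DOS 5 loads DOS high
                                 # too, leaving only a small stub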

And in a masterstroke, they made it available at retail. You could buy it in a shop and upgrade your PC or MS-DOS computer.

It sold extremely well and that made MS angry. It had never realised there was a potential retail market in after-market DOS upgrades or additional DOS features; it had been distracted by the success of Windows 3.

So MS copied the features of DR-DOS 5 and, playing catchup, made MS-DOS 5: all the features of MS-DOS 4, more free memory than ever thanks to a memory manager, and a full-screen editor (actually part of QBASIC, which was the GW-BASIC interpreter with the IDE from the QuickBASIC compiler).

And sold it as a retail upgrade.

It did way better than DR-DOS 5 because it had Microsoft’s marketing muscle.

Novell bought DR around this time, intending to go up against MS with a multi-pronged strategy: a better DOS, plus some best-of-breed apps -- it also bought WordPerfect, by then failing against Windows apps, notably Word for Windows and the Windows port of the Mac’s Excel spreadsheet. To rival Excel it bought Quattro Pro from Borland, a graphical spreadsheet for DOS.

Against Windows itself, Novell planned a Linux-based desktop, codenamed “Corsair”, which eventually became Caldera OpenLinux.

Novell bundled SuperStor disk compression, and re-implemented DOS Plus’ multitasking with TASKMAX.

Result, DR-DOS 6, AKA Novell DOS 6.

Microsoft responded with MS-DOS 6, still playing catchup. It added built-in antivirus and built-in backup, licensed in from other companies who never made the promised monies from selling enhanced versions. It also added disk compression. MS looked at licensing in disk compression from the #1 3rd party vendor, STAC, authors of Stacker. It got to see the code. In the end it didn’t go with Stacker but licensed Vertisoft DoubleDisk instead — presumably because it was cheaper. But it used some Stacker code in DoubleSpace.

STAC sued, won, and spent the money on moving out of the drive-compression market, knowing that drive sizes would grow and make its product irrelevant. It bought the ReachOut remote-control tool, and a server backup tool, and tried to rebrand as a server maintenance tools vendor, foreseeing the rise of internet-based remote admin -- but too soon.

First came MS-DOS 6.2, a bugfix release which also added the improved SCANDISK disk-repair tool.

Then MS-DOS 6.21, with the disk compression removed entirely while MS rewrote it to excise the stolen code.

Then MS-DOS 6.22, with the rewritten compression, renamed DriveSpace, in place of DoubleSpace.

Needless to say, Vertisoft made no money from add-on DriveSpace tools, and Central Point made no money from updates to DOS Antivirus or the bundled PC Backup. Both went under.

Novell responded with DR-DOS 7, with bundled peer-to-peer networking. MS didn’t bother as Windows for Workgroups already included that.

Then MS moved the goalposts with Windows 95, which actually bundled MS-DOS into Windows.

Novell did get Win95 running on top of DR-DOS, but there was no point and it wisely decided not to sell it. Once you had Win95, what DOS did underneath became rather irrelevant, memory management and all.

Novell gave up on the DOS line.

However, the Linux it sponsored did quite well. Caldera was the first desktop Linux I used as my main OS for a while. It had a great setup tool, LISA. It had the first graphical installer. It was the first distro to bundle the new KDE graphical desktop.

It was streets ahead of Red Hat or Debian at the time, let alone Slackware.

So Caldera later bought the Unix business -- which Novell had bought off AT&T and then sold on to SCO, the leading PC UNIX vendor -- and tried to integrate these 3 disparate products into a whole and a market.

It didn’t work but that’s a whole other story. What’s relevant to DOS is that Caldera spun off its DOS division as Lineo (who offered me a job once, as a leading DOS expert! But I didn’t want to move to Utah, partly because I like beer, partly because I’m atheist and thought it wouldn’t be too comfortable to live in the Mormon state.)

Lineo tried to make a business out of DR-DOS as a thin-client OS. It didn’t work. But Lineo inherited what was left of Digital Research. The Concurrent DOS business had been sold off to 2 of its leading resellers, and amazingly, that’s just barely still around. The realtime OS FlexOS and the multitasking X/GEM desktop had also been sold off; that line was sold by IBM until recently, and now by Toshiba.

But the other DR properties — CP/M and the GEM desktop for DOS — Lineo made open source, and both are still around today.

Meanwhile, MS lost interest in DOS as it pursued Windows 95 OSR2, Windows 98 and Windows ME. Indeed the embedded DOS in NT has never moved beyond version 5.5. But IBM co-owns DOS, and it did not lose interest. It continued to develop it for years, including the new features from the embedded MS-DOS within Win9x. The result was IBM PC DOS 7, then PC DOS 2000 (briefly bundled with VirtualPC!) and finally IBM PC DOS 7.1. IBM eschewed MS's editor and BASIC, replacing them with a version of its own OS/2 and mainframe editor E, and replacing QBASIC with REXX. It's an interesting OS.

That is the last ever member of the mighty DOS dynasty. I've blogged about it before. It was never released on its own, but IBM's ServerGuide Scripting Toolkit is a free download and includes the kernel and utilities of PC DOS 7.1. You can combine this with the rest of PC DOS 2000 -- reminder, it was in VirtualPC, and VirtualPC was a free download, too -- and build your own complete working copy. I have it booting "on the metal" on a Thinkpad X200 and it's a pleasure to use -- and very, very fast. Free DOS apps such as Microsoft Word 5.5, the AsEasyAs spreadsheet, the WordPerfect Editor and so on all run fine and amazingly fast.
