
Thu, Jul. 21st, 2016, 07:39 pm
Would anyone like to buy my Blackberry?

I am reluctant, but I have to sell this lovely phone.

It's a 32GB, fully-unlocked Blackberry Passport running the latest OS. It's still in support and receiving updates.

http://us.blackberry.com/smartphones/blackberry-passport/overview.html

The sale includes a PDair black leather folding case -- one of these:

https://www.amazon.co.uk/Pdair-Leather-BlackBerry-Passport-Stitch/dp/B012AU2FVO

It is used but in excellent condition and fully working. I have used both Tesco Mobile CZ and UK EE micro SIM cards and both worked perfectly.

The keyboard is also a trackpad and can be used to scroll and select text. The screen is square and hi-resolution -- the best I have ever used on a smartphone.

It runs the latest Blackberry 10 OS, which has the best email client on any pocket device. It can also run some Android apps and includes the Amazon app store. I side-loaded the Google Play store, but not all standard Android apps work. I am happy to help you load this if you want.

It is 100% usable without a Google, Apple or Microsoft account, if you are concerned about privacy issues.

It supports Blackberry Messenger, obviously, and has native clients for Twitter and other social networks -- I used Skype, Reddit, Foursquare and Untappd, among others. I also ran Android clients for Runkeeper, Last.FM and several other services. Facebook, Google+ and others are usable via their web interfaces.

I will do a full factory reset before handing it over.

It has a microSD slot for additional storage if you need it.

It is about a year old and has been used, so the battery is not good as new, but it still lasts much longer than the Android phablet that replaced it!

You can see it and try it before purchase if you wish.

Reason for sale: I needed more apps. I do not speak Czech and I need Google Translate and Google Maps almost every day.

Note: no mains adaptor is included, but it charges over micro-USB, so any charger will work -- it complains about other brands' chargers, but they still work.

IKEA sell a cheap multiport one:
http://www.ikea.com/cz/cs/catalog/products/00291891/



You can see photos of my device in my "Passport" album on Flickr.

I am hoping for CZK 10,000 but I am willing to negotiate.

Contact details on my profile page, or email lproven on Google Mail.

Tue, Jul. 19th, 2016, 07:04 pm
Respinning Linux -- Linux as a tool for bringing Internet access to the socially-disadvantaged

I found this post interesting:

"Respinning Linux"

It led me to comment as follows...

Have you folks encountered LXLE? It's a special version of Lubuntu, the lightest-weight of the official Ubuntu remixes, optimised for older hardware.

http://www.lxle.net/

Cinnamon is a lot less than ideal, because it uses a desktop compositor, which requires hardware OpenGL. If the graphics driver doesn't provide that, it is emulated in software by something called "LLVMpipe", which is slow and uses a lot of CPU time. The same is true of all desktops based on GNOME 3 -- including Unity, Elementary OS, RHEL/CentOS "Gnome Classic", SolusOS's Consort, and more. All are based on Gtk 3.

In KDE, it is possible to disable the compositor, but it's still very heavyweight.

The mainstream desktops that do not need compositing at all are, in order of size (memory footprint), from largest to smallest:
* MATE
* Xfce
* LXDE

All are based on Gtk 2, which has now been replaced with Gtk 3.

Of these, LXDE is the lightest, but it is currently undergoing a merger with the Razor-Qt desktop to become LXQt. This may be larger & slower when finished -- it's too soon to tell.

However, that gives LXDE the brightest-looking future of the three, because it will be based on a current toolkit. Neither MATE nor Xfce has announced a firm migration path to Gtk 3 yet.

Sun, Jun. 5th, 2016, 03:05 pm
Did the floppy disk, & diskette drives, die before their time?

I almost never saw 2.88MB floppy drives.

I know they were out there. The later IBM PS/2 machines used them, and so did some Unix workstations, but the 2.88MB format -- quad or extended density -- never really took off.

It did seem to me that if the floppy companies & PC makers had actually adopted them wholesale, the floppy disk as a medium might have survived for considerably longer.

The 2.88MB drives were never widely adopted, so the media remained expensive, ISTM -- and thus little software was distributed in the format, because few machines could read it.

By 1990 there was an obscure and short-lived 20MB floptical diskette format:

http://www.cbronline.com/news/insites_20mb_floptical_drive_reads_144mb_disks

Then in 1994 came 100MB Zip disks, which for a while were a significant format -- I had Macs with built-in-as-standard Zip drives.

Then the 3½" super floptical drives, the Imation SuperDisk in 1997, 144MB Caleb UHD144 in early 1998 and then 150MB Sony HiFD in late 1998.

(None of these later drives could read 2.88MB diskettes, AFAIK.)

After that, writable CDs got cheap enough to catch on, and now USB flash media has mostly killed them all off.

If the 2.88 had taken off -- and maybe even intermediate ~6MB and ~12MB formats (was that feasible?) before the 20MB ones -- then with widespread adoption there wouldn't have been an opening for the Zip drive, and the floppy drive might have remained a significant and important medium for another decade.

I didn't realise that the Zip drive eventually got a 750MB version, presumably competing with Iomega's own 1GB Jaz drive. If floppy drives had got into that territory, could they even have fended off CDs? Writable CDs were always a pain: CD-R was a one-shot medium, and thus inconvenient and expensive -- write on one machine, use a few times at best, then throw away.

I liked floppies. I enjoy playing with my ancient Sinclair computers, but loading from tape cassette is just a step too far. I remember the speed and convenience when I got my first Spectrum disk drive, and I miss it. Instant loading from an SD card just isn't the same. I don't use floppies on PCs any more -- I don't have a machine with a floppy drive in this country -- but for 8-bits, two drives with a meg or so of storage was plenty. I used them long after most people did, if only for updating BIOSes and so on.

Sun, Jun. 5th, 2016, 02:34 pm
The rise & fall of the first real x86 rival to Intel: the Cyrix 6x86

I was surprised to read someone castigating and condemning the Cyrix line of PC CPUs today.

For a while, I recommended 'em and used 'em myself. My own home PC was a Cyrix 6x86 P166+ for a year or two. Lovely machine -- a 133MHz processor that performed about 30-40% better than an Intel Pentium MMX at the same clock speed.

My then-employer, PC Pro magazine, recommended them too.

I only ever hit one problem: I had to turn down reviewing the latest version of Aldus PageMaker because it wouldn't run on a 6x86. I replaced it with a Baby-AT Slot 1 Gigabyte motherboard and a Pentium II 450. (Only the 100MHz front-side-bus Pentium IIs were worth bothering with, IMHO. The 66MHz FSB PIIs could be outperformed by a cheaper Super Socket 7 machine with a Cyrix chip.) It was very difficult to find a Baby-AT motherboard for a PII -- the market had switched to ATX by then -- but it allowed me to keep a case I particularly liked, and indeed most of the components in that case, too.

The one single product that killed the Cyrix chips was id Software's Quake.

Quake used very cleverly optimised x86 code that interleaved FPU and integer instructions: John Carmack had worked out that, apart from instruction loading (which used the same registers), FPU and integer operations used different parts of the Pentium core and could effectively be overlapped. This nearly doubled the speed of the FPU-intensive parts of the game's code.

The interleaving didn't work on Cyrix cores. The code ran fine, but the operations did not overlap, so those sections ran at roughly half the speed.
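To make the trick concrete, here is a minimal C sketch of the idea -- not Quake's actual code; the figures, names and the toy "shade" calculation are invented for illustration. The shape is: kick off the next span's expensive floating-point divide, then do integer-only work on the current span. On a Pentium the divide completes in the background; on a 6x86 its full latency is paid on every span, which is the halving described above.

/* Minimal illustrative sketch (not Quake source): overlap a long-latency
   FP divide with integer work, as the Pentium allowed.                  */
#include <stdio.h>

#define SPAN  16                 /* pixels per span, as in Quake's mapper */
#define WIDTH (SPAN * 8)         /* a toy scanline of 8 spans             */

int main(void)
{
    unsigned char scanline[WIDTH];
    float z = 2.0f, zstep = 0.05f;     /* invented depth values           */
    float inv_z = 1.0f / z;            /* divide for the first span       */

    for (int span = 0; span < WIDTH / SPAN; span++) {
        /* Start the NEXT span's slow divide first. On a Pentium the FDIV
           keeps running while the integer loop below executes; on a Cyrix
           6x86 the two streams serialise and the latency is fully paid.  */
        float next_inv_z = 1.0f / (z + zstep);

        /* Integer-only inner loop for the CURRENT span: cheap adds and
           stores that can run "underneath" the in-flight divide.         */
        int shade = (int)(inv_z * 255.0f);        /* constant per span    */
        for (int i = 0; i < SPAN; i++)
            scanline[span * SPAN + i] = (unsigned char)(shade + i);

        inv_z = next_inv_z;      /* result is ready by the time we loop   */
        z += zstep;
    }

    printf("first %d last %d\n", scanline[0], scanline[WIDTH - 1]);
    return 0;
}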

On every other benchmark and performance test we could devise, the 6x86 core was about 30-40% faster than the Intel Pentium core -- or the Pentium MMX, as nothing much used the extra instructions, so really only the additional L1 cache helped. (The Pentium 1 had 16 kB of L1; the Pentium MMX had 32 kB.)

But Quake was extremely popular, and everyone used it in their performance tests -- and thus hammered the Cyrix chips, even though the Cyrix was faster in ordinary use, in business/work/Windows operation, indeed in every other game except Quake.

And ultimately that killed Cyrix off. A shame, because the company had made some real improvements to the x86-32 design. Improving instructions-per-clock matters more than raising the raw clock speed -- which was Intel's focus right up until the demise of the NetBurst Pentium 4 line.

AMD with the 64-bit Sledgehammer core (Athlon 64 & Opteron) did the same to the P4 as Cyrix's 6x86 did to the Pentium 1. Indeed I have a vague memory some former Cyrix processor designers were involved.

Intel Israel came back with the (Pentium Pro-based) Pentium M line, intended for notebooks, and that led to the Core series, with IPC speeds that ultimately beat even AMD's. Today, nobody can touch Intel's high-end x86 CPUs. AMD is looking increasingly doomed, at least in that space. Sadly, though, Intel has surrendered the low end and is killing the Atom line.

http://www.pcworld.com/article/3063672/windows/the-death-of-intels-atom-casts-a-dark-shadow-over-the-rumored-surface-phone.html

The Atoms were always a bit gutless, but they were cheap, ran cool, and were frugal with power. In recent years they've enabled some interesting cheap low-end Windows 8 and Windows 10 tablets:

http://www.anandtech.com/show/8760/hp-stream-7-review

https://www.amazon.co.uk/Windows10-Tablet-Display-11000mAh-Battery-F-Black-B-Gray/dp/B01DF3UV3Y?ie=UTF8&keywords=hi12&qid=1460578088&ref_=sr_1_2&sr=8-2

Given that there is Android for x86, that there have already been Intel-powered Android phones, and that Windows 10 runs on phones today, this opened up the intriguing possibility of x86 Windows smartphones -- but then Intel slammed the door shut.

Cyrix still exists, but only as a brand for Via, with some very low-end x86 chips. Interestingly, these don't use Cyrix CPU cores -- they use a design taken from a different non-Intel x86 vendor, the IDT WinChip:

https://en.wikipedia.org/wiki/WinChip

I installed a few WinChips as upgrades for low-speed Pentium PCs. The WinChip was never all that fast, but it was a very simple, stripped-down core: it ran cool, was about as quick as a real Pentium core, was cheaper, and ran at higher clock speeds, so it was mainly sold as an aftermarket upgrade for tired old PCs. The Cyrix chips weren't a good fit for that role, as they needed different clock speeds, BIOS support, additional cooling and so on. IDT spotted a niche and exploited it -- and, oddly, that is the non-Intel x86 core that has survived at the low end, not the superior 6x86 one.

In the unlikely event that Via does some R&D work, it could potentially move into the space now vacated by the very low-power Atom chips. AMD is already strong in the low-end x86 desktop/notebook space with its Fusion processors which combine a 64-bit x86 core with an ATI-derived GPU, but they are too big, too hot-running and too power-hungry for smartphones or tablets.

Fri, May. 13th, 2016, 01:50 pm
The decline & fall of DEC - & MS

(Repurposed email reply)

Although I was educated on, and worked with, DEC systems, I didn't have much to do with the company itself. Its support was good, the kit ludicrously expensive, and the software offerings expensive, slow and lacking competitive features. However, it also scored in some ways.

My 60,000' view:

Microsoft knew EXACTLY what it was doing with its practices when it built up its monopoly. It got lucky with the technology: its planned future super products flopped, but it turned on a dime & used what worked.

But killing its rivals, any potential rival? Entirely intentional.

The thing is that no other company was poised to effectively counter the MS strategy. Nobody.

MS' almost-entirely-software-only model was almost unique. Its ecosystem of apps and 3rd party support was unique.

In the end, it actually did us good. Gates wanted a computer on every desk. We got that.

The company's strategy called for open compatible generic hardware. We got that.

Only one platform, one OS, was big enough, diverse enough, to compete: Unix.

But commercial, closed, proprietary Unix couldn't. 2 ingredients were needed:

#1 COTS hardware - which MS fostered;
#2 FOSS software.

Your point about companies sharing their source is noble, but I think inadequate. The only thing that could compete with a monolithic software monopolist on open hardware was open software.

MS created the conditions for its own doom.

Apple cleverly leveraged FOSS Unix and COTS x86 hardware to take the Mac brand and platform forward.

Nobody else did, and they all died as a result.

If Commodore, Atari and Acorn had adopted similar strategies (as happened independently of them later, after their death, resulting in AROS, AFROS & RISC OS Open), they might have lived.

I can't see it fitting the DEC model, but I don't know enough. Yes, cheap low-end PDP-11s with FOSS OSes might have kept them going longer, but not saved them.

The deal with Compaq was catastrophic. Compaq was in Microsoft's pocket. I suspect that Intel leant on Microsoft and Microsoft then leant on Compaq to axe Alpha, and Compaq obliged. It also knifed HP OpenMail, possibly the Unix world's only viable rival to Microsoft Exchange.

After that it was all over bar the shouting.

Microsoft could not have made a success of OS/2 3 without Dave Cutler... But DEC couldn't have made a success out of PRISM either, I suspect. Maybe a stronger DEC would have meant Windows NT would never have happened.

Wed, Apr. 27th, 2016, 07:06 pm
Where did we all go wrong? And why doesn't anyone remember? [Tech blog post]

My contention is that a large part of the reason that we have the crappy computers that we do today -- lowest-common-denominator boxes, mostly powered by one of the kludgiest and most inelegant CPU architectures of the last 40 years -- is not technical, nor even primarily commercial or due to business pressures, but rather, it's cultural.

When I was playing with home micros (mainly Sinclair and Amstrad; the American stuff was just too expensive for Brits in the early-to-mid 1980s), the culture was that Real Men programmed in assembler and the main battle was Z80 versus 6502, with a few weirdos saying that 6809 was better than either. BASIC was the language for beginners, and a few weirdos maintained that Forth was better.

At university, I used a VAXcluster and learned to program in Fortran-77. The labs had Acorn BBC Micros in them -- solid machines, with the best 8-bit BASIC ever, and they could interface both with lab equipment over IEEE-488 and with generic printers and so on over Centronics parallel and its RS-423 interface [EDIT: fixed!], which could talk to RS-232 kit.

As I discovered when I moved into the professional field a few years later (1988), this wasn't that different from the pro stuff. A lot of apps were written in various BASICs, and in the old era of proprietary OSes on proprietary kit, for performance, you used assembler.

But a new wave was coming. MS-DOS was already huge and the Mac was growing strongly. Windows was on v2 and was a toy, but Unix was coming to mainstream kit, or at least affordable kit. You could run Unix on PCs (e.g. SCO Xenix), on Macs (A/UX), and my employers had a demo IBM RT-6150 running AIX 1.

Unix wasn't only the domain (pun intentional) of expensive kit priced in the tens of thousands.

A new belief started to spread: that if you used C, you could get near-assembler performance without the pain, and the code could be ported between machines. DOS and Mac apps started to be written (or rewritten) in C, and some were even ported to Xenix. In my world, nobody used stuff like A/UX or AIX, and Xenix was specialised. I was aware of Coherent as the only "affordable" Unix, but I never saw a copy or saw it running.

So this second culture of C code running on non-Unix OSes appeared. Then the OSes started scrambling to catch up with Unix -- first OS/2, then Windows 3, then, for a decade, the parallel universe of Windows NT, until XP became established and Win9x finally died. Meanwhile, Apple and IBM flailed around until IBM surrendered and Apple merged with NeXT and switched to NeXTstep.

Now, Windows is evolving to be more and more Unix-like, with GUI-less versions, clean(ish) separation between GUI and console apps, a new rich programmable shell, and so on.

While the Mac is now a Unix box, albeit a weird one.

Commercial Unix continues to wither away. OpenVMS might make a modest comeback. IBM mainframes seem to be thriving; every other kind of big iron is now emulated on x86 kit, as far as I can tell. IBM has successfully killed off several efforts to do this for z Series.

So now, it's Unix except for the single remaining mainstream proprietary system: Windows. Unix today means Linux, while the weirdos use FreeBSD. Everything else seems to be more or less a rounding error.

C always was like carrying water in a sieve, so now, we have multiple C derivatives, trying to patch the holes. C++ has grown up but it's like Ada now: so huge that nobody understands it all, but actually, a fairly usable tool.

There's the kinda-sorta FOSS "safe C++ in a VM", Java. The proprietary kinda-sorta "safe C++ in a VM", C#. There's the not-remotely-safe kinda-sorta C in a web browser, Javascript.

And dozens of others, of course.

Even the safer ones run on a basis of C -- so the lovely, cuddly, friendly Python that everyone loves has weird C-style string-formatting semantics to mess up the heads of beginners.

Perl abandoned its base and planned to move onto a VM; then the VM went wrong; now there's a new VM, and, to general amazement and general lack of interest, Perl 6 is finally here.

All the others are still implemented in C, mostly on a Unix base, like Ruby, or on a JVM base, like Clojure and Scala.

So they still have C-like holes, and there are frequent patches and updates to try to make them retain some water for a little while, while the "cyber criminals" make hundreds of millions.

Anything else is "uncommercial" or "not viable for real world use".

Borland totally dropped the ball and lost a nice little earner in Delphi, but it continues as Free Pascal and so on.

Apple goes its own way, but has forgotten the truly innovative projects it had pre-NeXT, such as Dylan.

There were real projects that were actually used for real work, like Oberon the OS, written in Oberon the language. Real pioneering work in UIs, such as Jef Raskin's machines, the original Mac and Canon Cat -- forgotten. People rhapsodise over the Amiga and forget that the planned OS, CAOS, to be as radical as the hardware, never made it out of the lab. Same, on a smaller scale, with the Acorn Archimedes.

Despite that, of course, Lisp never went away. People still use it, but they keep their heads down and get on with it.

Much the same applies to Smalltalk. Still there, still in use, still making real money and doing real work, but forgotten all the same.

The Lisp Machines and Smalltalk boxes lost the workstation war. Unix won, and as history is written by the victors, now the alternatives are forgotten or dismissed as weird kooky toys of no serious merit.

The senior Apple people didn't understand the essence of what they saw at PARC: they only saw the chrome. They copied the chrome, not the essence, and now all that any of us have is the chrome. We have GUIs, but on top of the nasty kludgy hacks of C and the like. A late-'60s skunkworks project now runs the world, and the real serious research efforts to make something better, both before and after, are forgotten historical footnotes.

Modern computers are a vast disappointment to me. We have no thinking machines. The Fifth Generation, Lisp, all that -- gone.

What did we get instead?

Like dinosaurs, the expensive high-end machines of the '70s and '80s didn't evolve into their successors. They were just replaced. First came little cheapo 8-bits, not real or serious at all, although people did serious stuff with them because they were all they could afford. The early 8-bits ran semi-serious OSes such as CP/M, but when their descendants sold a thousand times more, those descendants weren't running descendants of that OS -- no, it and its creator died.

CP/M evolved into a multiuser multitasking 386 OS that could run multiple MS-DOS apps on terminals, but it died.

No, then the cheapo 8-bits thrived in the form of an 8/16-bit hybrid, the 8086 and 8088, and a cheapo knock-off of CP/M.

This got a redesign into something grown-up: OS/2.

Predictably, that died.

So the hacked-together GUI for DOS got re-invigorated with an injection of OS/2 code, as Windows 3. That took over the world.

The rivals -- the Amiga, ST, etc.? 680x0 chips, lots of flat memory, whizzy graphics and sound? All dead.

Then Windows got re-invented with some OS/2 3 ideas and code, and some from VMS, and we got Windows NT.

But the marketing men got to it and ruined its security and elegance, producing the lipstick-and-high-heels Windows XP. That version -- insecure and flaky, with its terrible bodged-in browser -- was, of course, the one that sold.

Linux got nowhere until it copied the XP model. The days of small programs, everything-is-a-text-file and so on -- all forgotten. Nope: lumbering GUI apps, CORBA and RPC and other weird plumbing, huge complex systems -- but it looks and works kinda like Windows and the Mac now, so people use it.

Android looks kinda like iOS and people use it in their billions. Newton? Forgotten. No, people have Unix in their pocket, only it's a bloated successor of Unix.

The efforts to fix and improve Unix -- Plan 9, Inferno -- forgotten. A proprietary microkernel Unix-like OS for phones -- Blackberry 10, based on QNX -- not Androidy enough, and bombed.

We have less and less choice, made from worse parts on worse foundations -- but it's colourful and shiny and the world loves it.

That makes me despair.

We have poor-quality tools, built on poorly-designed OSes, running on poorly-designed chips. Occasionally, fragments of older better ways, such as functional-programming tools, or Lisp-based development environments, are layered on top of them, but while they're useful in their way, they can't fix the real problems underneath.

Occasionally someone comes along and points this out and shows a better way -- such as Curtis Yarvin's Urbit. Lisp Machines re-imagined for the 21st century, based on top of modern machines. But nobody gets it, and its programmer has some unpleasant and unpalatable ideas, so it's doomed.

And the kids who grew up after C won the battle deride the former glories, the near-forgotten brilliance that we have lost.

And it almost makes me want to cry sometimes.

We should have brilliant machines now, not merely Steve Jobs' "bicycles for the mind", but Gossamer Albatross-style hang-gliders for the mind.

But we don't. We have glorified 8-bits. They multitask semi-reliably, they can handle sound and video and 3D and look pretty. On them, layered over all the rubbish and clutter and bodges and hacks, inspired kids are slowly brute-forcing machines that understand speech, which can see and walk and drive.

But it could have been so much better.

Charles Babbage never finished the Difference Engine. If he had, it would have paid for him to build his Analytical Engine, and that would have given the Victorian British Empire the steam-driven computer, which would have transformed history.

But he got distracted and didn't deliver.

We started to build what a few old-timers remember as brilliant machines, machines that helped their users to think and to code, with brilliant -- if flawed -- software written in the most sophisticated computer languages yet devised, by the popular acclaim of the people who really know this stuff: Lisp and Smalltalk.

But we didn't pursue them. We replaced them with something cheaper -- with Unix machines, an OS only a nerd could love. And then we replaced the Unix machines with something cheaper still -- the IBM PC, a machine so poor that the £125 ZX Spectrum had better graphics and sound.

And now, we all use descendants of that. Generally acknowledged as one of the poorest, most-compromised machines, based on descendants of one of the poorest, most-compromised CPUs.

Yes, over the 40 years since then, most of the rough edges have been polished out. The machines are now small, fast and power-frugal, with tons of memory and storage, with great graphics and sound. But it's taken decades to get here.

And the OSes have developed. Now they're feature-rich, fairly friendly, really very robust considering the stone-age stuff they're built from.

But if we hadn't spent 3 or 4 decades making a pig's ear into a silk purse -- if we'd started with a silk purse instead -- where might we have got to by now?

Mon, Apr. 25th, 2016, 03:55 pm
Acorn: from niche to forgotten obscurity and total industry dominance at the same time

More retrocomputing meanderings -- whatever became of the ST, Amiga and Acorn operating systems?

The Atari ST's GEM desktop also ran on MS-DOS, DR's own DOS+ (a forerunner of the later DR-DOS) and today is included with FreeDOS. In fact the first time I installed FreeDOS I was *very* surprised to find my name in the credits. I debugged some batch files used in installing the GEM component.

The ST's GEM was the same environment: ST GEM was derived from GEM 1, and PC GEM from GEM 2, which was crippled after an Apple lawsuit. Then they diverged. FreeGEM attempted to merge them again.

But the ST's branch prospered, before the rise of the PC killed off all the alternative platforms. Actual STs can be quite cheap now, or you can even buy a modern clone:

http://harbaum.org/till/mist/index.shtml

If you don't want to lash out but have a PC, the Aranym environment gives you something of the feel of the later versions. It's not exactly an emulator, more a sort of compatibility environment that enhances the "emulated" machine as much as it can using modern PC hardware.

http://aranym.org/

And the ST's GEM OS was so modular that different 3rd parties cloned every component, separately -- some commercially, some as FOSS. The Aranym team basically put together a sort of "distribution" of as many FOSS components as they could, assembling a nearly-complete OS, then wrote the few remaining bits to glue it together into a functional whole.

So, finally, after the death of the ST and its clones, there was an all-FOSS OS for it. It's pretty good, too. It's called AFROS, Atari Free OS, and it's included as part of Aranym.

I longed to see a merger of FreeGEM and Aranym, but it was never to be.

The history of GEM and TOS is complex.

Official Atari TOS+GEM evolved into TOS 4, which included the FOSS MiNT multitasking layer, and which isn't much like the original ROM version in the first STs.

The underlying TOS OS is not quite like anything else.

AIUI, CP/M-68K was a real, if rarely-seen, OS.

However, it proved inadequate to support GEM, so it was discarded. A new kernel was written using some of the tech from what later became DR-DOS on the PC -- something less like CP/M and more like MS-DOS: directories separated with backslashes, FAT-format disks, multiple executable types, 8.3 filenames, all that stuff.

None of the command-line elements of CP/M or any DR DOS-like OS were retained -- the kernel booted the GUI directly and there was no command line, like on the Mac.

This is called GEMDOS and AIUI it inherits from both the CP/M-68K heritage and from DR's x86 DOS-compatible OSes.

The PC version of GEM also ran on Acorn's BBC Master 512 which had an Intel 80186 coprocessor. It was a very clever machine, in a limited way.

Acorn's series of machines are not well-known in the US, AFAICT, and that's a shame. They were technically interesting, more so IMHO than the Apple II and III, TRS-80 series etc.

The original Acorns were 6502-based, but with good graphics and sound, a plethora of ports, a clear separation between OS, BASIC and add-on ROMs such as the various DOSes, etc. The BASIC was, I'd argue strongly, *the* best 8-bit BASIC ever: named procedures, local variables, recursion, inline assembler, etc. Also the fastest BASIC interpreter ever, and quicker than some compiled BASICs.

Acorn built for quality, not price; the machines were aimed at the educational market, which wasn't so price-sensitive, a model that NeXT emulated. Home users were welcome to buy them & there was one (unsuccessful) home model, but they were unashamedly expensive and thus uncompromised.

The only conceptual compromise in the original BBC Micro was that there was provision for ROM bank switching, but not RAM. The 64kB memory map was split 50:50 between ROM and RAM. You could switch ROMs, or put RAM in their place, but you couldn't have more than 64kB in total. This meant that the high-end machine had only 32kB of RAM, and high-res graphics modes could take 21kB or so, leaving little space for code -- unless it was in ROM, of course.
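For a sense of scale, a quick back-of-the-envelope sum in C, using the round figures above (32kB of RAM, roughly 21kB for a hi-res screen mode); real OS and BASIC workspace overheads are ignored, so the number is only indicative:

/* Back-of-the-envelope sums for the BBC Micro memory map described above,
   using the post's own approximate figures.                              */
#include <stdio.h>

int main(void)
{
    const int address_space = 64 * 1024;   /* 6502 address space          */
    const int rom_half      = 32 * 1024;   /* OS, BASIC, sideways ROMs    */
    const int ram           = address_space - rom_half;  /* 32kB of RAM   */
    const int hires_screen  = 21 * 1024;   /* ~21kB for a hi-res mode     */

    printf("RAM:                   %5d bytes\n", ram);
    printf("Hi-res screen:         %5d bytes\n", hires_screen);
    printf("Left for program/data: %5d bytes (~%dkB)\n",
           ram - hires_screen, (ram - hires_screen) / 1024);
    return 0;
}

That leaves only around 11kB for program and data, which is why so much useful code lived in ROM.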

The later BBC B+ and BBC Master series fixed that. They also took ROM cartridges, rather than bare chips inserted in sockets on the main board, and added a numeric keypad.

Acorn looked at the 16-bit machines in the mid-80s, mostly powered by Motorola 68000s of course, and decided they weren't good enough and that the tiny UK company could do better. So it did.

But in the meantime, it kept the 6502-based, resolutely-8-bit BBC Micro line alive with updates and new models, including ROM-based terminals and machines with a range of built-in coprocessors: faster 6502-family chips for power users, Z80s for CP/M, Intel's 80186 for kinda-sorta PC compatibility, the NatSemi 32016 with PANOS for ill-defined scientific computing, and finally, an ARM copro before the new ARM-based machines were ready.

Acorn designed the ARM RISC chip in-house, then launched its own range of ARM-powered machines, with an OS based on the 6502 range's. Although limited, this OS is still around today and can be run natively on a Raspberry Pi:

https://www.riscosopen.org/content/

It's very idiosyncratic -- the filesystem, the command line and the default editor are all totally unlike anything else. The file-listing command is CAT, the directory separator is a full stop (i.e. a period), and the root directory is called $. The editor is a very odd dual-cursor thing. It's fascinating, totally unrelated to the entire DEC/MS-DOS family and to the entire Unix family. There is literally and exactly nothing else even slightly like it.
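Just to make that concrete, here is a toy C function -- entirely hypothetical, not any real RISC OS API -- that rewrites a Unix-style path into the shape described above, with '$' as the root and full stops between directories. Real RISC OS naming has further wrinkles that this ignores.

/* A toy illustration of the RISC OS path shape described above: '$' is
   the root directory and '.' separates components, so the Unix-style
   "/Documents/2016/notes" comes out as "$.Documents.2016.notes".
   Purely illustrative, not any real RISC OS call.                        */
#include <stdio.h>
#include <string.h>

static void to_riscos_path(const char *unix_path, char *out, size_t outlen)
{
    size_t pos = 0;
    if (pos < outlen - 1)
        out[pos++] = '$';                 /* root is '$', not '/'         */

    for (const char *p = unix_path; *p && pos < outlen - 1; p++) {
        if (*p == '/')
            out[pos++] = '.';             /* '.' is the directory separator */
        else
            out[pos++] = *p;
    }
    out[pos] = '\0';
}

int main(void)
{
    char buf[128];
    to_riscos_path("/Documents/2016/notes", buf, sizeof buf);
    printf("%s\n", buf);                  /* prints $.Documents.2016.notes */
    return 0;
}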

It was the first GUI OS to implement several features that are now universal across GUIs: anti-aliased font rendering, and solid window dragging and resizing (moving the whole window, not just an outline). Significantly, it was also the first graphical desktop with a taskbar, before NeXTstep and long before Windows 95.

It supports USB and can access the Internet and the web. There are free clients for chat, email, FTP, the web and so on, and a modest range of free productivity tools, although most things are commercial.

But there's no proper inter-process memory protection, GUI multitasking is cooperative, and consequently it's not amazingly stable in use. It does support pre-emptive multitasking, but via the text editor, bizarrely enough, and only of text-mode apps. There was also a pre-emptive multitasking version of the desktop, but it wasn't very compatible, didn't catch on and is not included in current versions.

But saying all that, it's very interesting, influential, shared-source, entirely usable today, and it runs superbly on the £25 Raspberry Pi, so there is little excuse not to try it. There's also a FOSS emulator which can run the modern freeware version:

http://www.marutan.net/rpcemu/

For users of the old hardware, there's a much more polished commercial emulator for Windows and Mac which has its own, proprietary fork of the OS:

http://www.virtualacorn.co.uk/index2.htm

There's an interesting parallel with the Amiga. Both Acorn and Commodore had ambitious plans for a modern multitasking OS, which both companies described as Unix-like. In both cases the project didn't deliver, and the ground-breaking, industry-redefining hardware instead shipped with a much less ambitious OS. Both of those OSes were nonetheless widely loved, and both still survive today, 30 years later, in multiple actively-maintained forks -- even though Unix caught up with and long ago surpassed these 1980s oddballs.

AmigaOS, based in part on the academic research OS Tripos, has 3 modern forks: the FOSS AROS, on x86, and the proprietary MorphOS and AmigaOS 4 on PowerPC.

Acorn RISC OS, based in part on Acorn MOS for the 8-bit BBC Micro, has two contemporary forks: RISC OS 5, owned by Castle Technology but developed by RISC OS Open -- shared source rather than FOSS -- running on the Raspberry Pi, BeagleBoard and some other ARM boards, plus some old hardware and RPCEmu; and RISC OS 4, now owned by the company behind VirtualAcorn, which is run by an ARM engineer who apparently made good money selling software ARM emulators for x86 to ARM Holdings.

Commodore and the Amiga are both long dead and gone, but the name periodically changes hands and reappears on various bits of modern hardware.

Acorn is also long dead, but its scion ARM Holdings designs the world's most popular series of CPUs, totally dominates the handheld sector, and outsells Intel, AMD & all other x86 vendors put together something like tenfold.

Funny how things turn out.

Tue, Apr. 5th, 2016, 02:57 pm
The AmigaOS lives on! It's up to 4.1 now. But is there any point today?

I am told it's lovely to use. Sadly, it only runs on obscure PowerPC-based kit that costs a couple of thousand pounds and can be out-performed by a £300 PC.

AmigaOS's owners -- Hyperion, I believe -- chose the wrong platform.

On a Raspberry Pi or something, it would be great. On obscure expensive PowerPC kit, no.

Also, saying that, I got my first Amiga in the early 2000s. If I'd had one 15y earlier, I'd probably have loved it, but I bought a 2nd hand Archimedes instead (and still think it was the right choice for a non-gamer and dabbler in programming).

A few years ago, with a LOT of work using 3 OSes and 3rd-party disk-management tools, I managed to coax MorphOS onto my Mac mini G4. Dear hypothetical gods, that was a hard install.

It's... well, I mean, it's fairly fast, but... no Wifi? No Bluetooth?

And the desktop. It got hit hard with the ugly stick. I mean, OK, it's not as bad as KDE, but... ick.

Learning AmigaOS when you already know more modern OSes -- OS X, Linux, gods help us, even Windows -- well, the Amiga seems pretty weird, and often for no good reason. E.g. there's a graphical file manager, but not all files have icons. They're not hidden, they just don't have icons, so if you want to see them, you have to do a second show-all operation. And the dependence on RAM disks, which are a historical curiosity now. And needing to right-click to show the menu bar when it's on a screen edge.

A lot of pointless arcana, just so Apple didn't sue, AFAICT.

I understand the love if one loved it back then. But now? Yeeeeeeaaaaaah, not so much.

Not that I'm proclaiming RISC OS to be the business now. I like it, but it's weird too. But AmigaOS does seem a bit primitive now. OTOH, if they sorted out multiprocessor support and memory protection and it ran on cheap ARM kit, then yeah, I'd be interested.

Wed, Mar. 30th, 2016, 08:33 pm
This will get me accused of fanboyism (again), but like it or not, Apple shaped the PC industry.

I recently read a friend of mine claiming that "Both the iPhone and iPod were copied from other manufacturers, to a large extent."

This is a risible claim, AFAICS.

There were pocket MP3 jukeboxes before the iPod. I still own one. They were fairly tragic efforts.

There were smartphones before the iPhone. I still have at least one of them, too. Again, really tragic from a human-computer interaction point of view.


AIUI, the iPhone originated internally as a shrunk-down tablet. The tablet project began with a personal comment from Bill Gates to Steve Jobs to the effect that, although tablets were a great idea, people simply didn’t want them: Microsoft had made them, and they didn’t sell.
Jobs’ response was that the Microsoft ones didn’t sell because they were no good, not because people didn’t want tablets. In particular, Jobs stated that using a stylus was a bad idea. (This is also a pointer as to why he cancelled the Newton. And guess what? I've got one of them, too.)

Gates, naturally, contested this, and Jobs started an internal project to prove him wrong: a stylus-free finger-operated slim light tablet. However, when it was getting to prototype form, he allegedly realised, with remarkable prescience, that the market wasn’t ready yet, and that people needed a first step — a smaller, lighter, simpler, pocketable device, based on the finger-operated tablet.

Looking for a role or function for such a device, the company came up with the idea of a smartphone.

Smartphones certainly existed, but they were a geek toy, nothing more.

Apple was bold enough to make a move that would kill its most profitable line — the iPod — with a new product. Few would be so bold.

I can’t think of any other company that would have been bold enough to invent the iPhone. We might have got to devices as capable as modern smartphones and tablets, but I suspect they’d have still been festooned in buttons and a lot clumsier to use.

It’s the GUI story again. Xerox sponsored the invention and original development but didn’t know WTF to do with it. Contrary to the popular history, it did productise it, but as a vastly expensive specialist tool. It took Apple to make it the standard method of HCI, and it took Apple two goes and many years. The Lisa was still too fancy and expensive, and the original Mac too cut-down and too small and compromised.

The many rivals’ efforts were, in hindsight, almost embarrassingly bad. IBM’s TopView was a pioneering GUI and it was rubbish. Windows 1 and 2 were rubbish. OS/2 1.x was rubbish, and to be honest, OS/2 2.x was the pre-iPhone smartphone of GUI OSes: very capable, but horribly complex and fiddly.

Actually, arguably — and demonstrably, from the Atari ST market — DR GEM was a far better GUI than Windows 1 or 2. GEM was a rip-off of the Mac; the PC version got sued and crippled as a result, so blatant was it. It took MS over a decade to learn from the Mac (and GEM) and produce the first version of Windows with a GUI good enough to rival the Mac’s, while being different enough not to get sued: Windows 95.

Now, 2 decades later, everyone’s GUI borrows from Win95. Linux is still struggling to move on from Win95-like desktops, and even Mac OS X, based on a product which inspired Win95, borrows some elements from the Win95 GUI.

Everyone copies MS, and MS copies Apple. Apple takes bleeding-edge tech and turns geek toys into products that the masses actually want to buy.

Microsoft’s success is founded on the IBM PC, and that was IBM’s response to the Apple ][.

Apple has been doing this consistently for about 40 years. It often takes it 2 or 3 goes, but it gets there.

  • First time: 8-bit home micros (the Apple ][, an improved version of a DIY kit.)

  • Second time: GUIs (first the Lisa, then the Mac).

  • Third time: USB (on the iMac, arguably the first general-purpose PC designed and sold for Internet access as its primary function).

  • Fourth time: digital music players (the iPod wasn’t even the first with a hard disk).

  • Fifth time: desktop Unix (OS X, based on NeXTstep).

  • Sixth time: smartphones (based on what became the iPad, remember).

  • Seventh time: tablets (the iPad, actually progenitor of the iPhone rather than the other way round).

Yes, there are too many Mac fans, and they’re often under-informed. But there are also far too many Microsoft apologists, and too many Linux ones, too.

I use an Apple desktop, partly because with a desktop, I can choose my own keyboard and pointing device. I hate modern Apple ones.

I don’t use Apple laptops or phones. I’ve owned multiple examples of both. I prefer the rivals.

My whole career has been largely propelled by Microsoft products. I still use some, although my laptops run Linux, which I much prefer.

I am not a fanboy of any of them, but sadly, anyone who expresses fondness or admiration for anything Apple will be inevitably branded as one by the Anti-Apple fanboys, whose ardent advocacy is just as strong and just as irrational.

As will this.

Mon, Mar. 21st, 2016, 02:57 pm
Confessions of a Sinclair fan

I'm very fond of Spectrums (Spectra?) because a Spectrum was the first computer I owned. I'd used my uncle's ZX-81, and one belonging to a neighbour, and Commodore PETs at school, but the PET was vastly too expensive and the ZX-81 too limited to be of great interest to me.

I read an article once that praised Apple for bringing home computers to the masses with the Apple ][, the first home computer for under US$ 1000. A thousand bucks? That was fantasy winning-the-football-pools money!

No, for me, the hero of the home computer revolution was Sir Clive Sinclair, for bringing us the first home computer for under GB £100. A hundred quid was achievable. A thousand would have gone on a newer car or a family holiday.
