Tue, Oct. 27th, 2015, 02:38 pm
A Brief History of Virtualisation

As far as I can recall, I didn't plug this at the time, as this series of 5 articles for the Register was later collated into my first-ever book - a Kindle ebook of the same title: http://bit.ly/trabhov

However, that was about 4 years ago now, and as it's one of the few times in my tech career that I have accurately predicted a future technology trend -- i.e., containers -- I think it's time.

You can buy the ebook here if you would like to support my work -- no, I don't get royalties, but it will endear me to the Reg:


Or, if you're a cheapskate and just want to read the content for free, then here are the component articles:






(Part 3 is the one about containers)

Enjoy. Buy a copy for all your friends; it's the ideal holiday gift!

Wed, Sep. 23rd, 2015, 04:34 pm
BASIC: good times, bad times, how to lose big

(The title is a parody of http://www.dreamsongs.com/WIB.html )

Even today, people still rail against the horrors of BASIC, as per Edsger Dijkstra's famous comment about it brain-damaging beginner programmers beyond any hope of redemption:


I rather feel that this is due to perceptions of some of the really crap early 8-bit BASICs, and wouldn't have applied if students had learned, say, BBC BASIC or one of the other better dialects.

For example, Commodore's pathetically-limited BASIC as supplied on the most successful home computer ever, the Commodore 64, in 1982. Despite its horrors, it's remembered fondly by many. There's even a modern FOSS re-implementation of it!


I've long been puzzled as to exactly why the Commodore 64 shipped with such a terrible, limited, primitive BASIC in its ROM: CBM BASIC 2.0, essentially the 6502 version of Microsoft's MS-BASIC. It wasn't done for space reasons -- the original Microsoft BASIC fitted into 4kB of ROM and a later version into 8kB:


Acorn's BBC BASIC (first released a year earlier, in 1981) was a vastly better dialect.

AFAIK all the ROMable versions of BBC BASIC (BASIC I to BASIC 4.62) fitted into a 16kB ROM, so in terms of space, it was doable.


IOW, CBM had enough room; the C64’s KERNAL+BASIC were essentially those of the original PET, and fitted into 8kB ROMs, I think. And the C64 shipped after the B and P series machines, the CBM-II, which ran CBM BASIC 4. OK, BASIC 4 wasn’t much of an improvement, but it was better.

Looking back years later, and reading stuff like Cameron Kaiser’s “Secret Weapons of Commodore” site:


… it seems to me that Commodore management never really had much of an idea of what they were doing. Unlike companies such as Sinclair or Acorn, labouring for years over tiny numbers of finely-honed models, in the 8-bit era Commodore had multiple teams designing dozens of models of all sorts of kit, often conflicting with one another, and chose seemingly at whim which products to ship and which to kill — sometimes early, sometimes when a machine was nearly ready and the packaging was being designed.

(Apple was similar, but at a smaller scale — e.g. the Apple /// competing with the later Apple ][ machines, and the Mac competing with the Lisa, and then the Apple ][GS competing with the Mac.)

There were lovely devices that might have thrived, such as the C65, which were killed.

There were weird, mostly inexplicable hacked-together things, such as the C128: a bastard hybrid of a C64, plus a slightly-upgraded C64, plus, of all things, a CP/M micro based around an entirely different and totally incompatible processor — so the C128 had two: a 6502 derivative and a Z80. Bizarre.

There were determined efforts to enhance product lines whose times were past, such as the CBM-II machines, an enhanced PET when the IBM PC was already taking over.

There were odd half-assed efforts to fix problems with released products, such as the C16 and Plus-4, which clearly showed that management didn’t understand their own successes: the C64 was a wildly-successful upgrade of the popular VIC-20, but rather than learn from that and do it again, Commodore did something totally different and incompatible, launched with some fanfare, and appeared mystified that it bombed.

It’s a very strange story of a very schizophrenic company.

And of course, rather than develop their own successor for the 16-bit era, they bought it in — the Lorraine, later the Amiga, a spiritual successor to the Atari 8-bit machines, which themselves were inspired kit for their time.

This left Atari in the lurch, but the company responded in an inspired way with the ST: a clever mixture of off-the-shelf parts -- PC-type where that was good enough (e.g. the graphics controller), or from the previous generation of 8-bits (e.g. the sound chip) -- plus a bought-in, adapted OS (Digital Research's GEMDOS plus GEM, never crippled like the PC version was by Apple's lawsuit), which meant PC disk formats and file compatibility. And of course the brilliant inclusion of MIDI ports, foreseeing an entire industry that was just around the corner.

The ST is what the Sinclair QL should have been: a cheap, affordable, usable 16-bit computer. Whereas the poor doomed QL was Sinclair doing its trademark thing too far: a 16-bit machine cut down to the point that it was no better than a decent 8-bit machine.

Interesting times.

Whereas now, almost all the diversity is gone. Today, we just have generic x86 boxes and occasional weird little ARM things, and apart from some research or hobbyist toys, just 2 OS families -- Windows NT or some flavour of Unix.

Sat, Sep. 5th, 2015, 02:20 pm
Containerising Linux desktop apps -- how and why

There are moves afoot to implement desktop apps inside containers on Linux -- e.g.


This is connected with the current uptake of Docker. There seems to be a lot of misunderstanding about Docker, exemplified by a mailing list post I just read which proposes running different apps in different user accounts instead and accessing them via VNC. This is an adaptation of my reply.

Corrections welcomed!

Docker is a kind of standardized container for Linux.

Containers are a sort of virtual machine.

Current VMs are PC emulators that themselves run on a PC: they virtualise the PC's hardware, so you can run multiple OSes at once on one physical machine.

This is useful if you want to run, say, 3 different Linux distros, Windows and Solaris on the same machine at once.

If you run lots of copies of the same OS, it is very inefficient, as you duplicate lots of code.

Containers virtualise the OS instead of the computer. 1 OS instance, 1 kernel, but to the apps running on that OS, each app has its own OS. Apps cannot see other apps at all. The virtualisation means that each app thinks it is running standalone on the OS, with nothing else installed.

This means that you can, say, run 200 instances of Apache on 1 instance of Linux, and they are all isolated. If one crashes, the others don't. You can mix versions, have custom modules in one that the others don't have, etc.

All without the overhead of running 200 copies of the OS.
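Under the hood, this isolation comes from kernel namespaces (plus cgroups for resource limits). Here is a minimal sketch of the core primitive in Python -- not how Docker is implemented, just the kernel call it builds on. The flag values are copied from <linux/sched.h>, and it needs root:

    import ctypes
    import os

    # New mount and UTS (hostname) namespaces, from <linux/sched.h>
    CLONE_NEWNS  = 0x00020000
    CLONE_NEWUTS = 0x04000000

    libc = ctypes.CDLL("libc.so.6", use_errno=True)

    def run_isolated(argv):
        # Detach from the host's hostname and mount table (needs root)
        if libc.unshare(CLONE_NEWNS | CLONE_NEWUTS) != 0:
            err = ctypes.get_errno()
            raise OSError(err, os.strerror(err))
        # Only this process and its children see the new hostname
        libc.sethostname(b"container", len(b"container"))
        pid = os.fork()
        if pid == 0:
            os.execvp(argv[0], argv)
        os.waitpid(pid, 0)

    run_isolated(["hostname"])  # prints "container"; the host keeps its real name

Docker's real contribution is standardising the packaging and distribution of filesystem images on top of primitives like these.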

Containerising apps is a security measure. It means that if, say, you have a compromised version of LibreOffice that contains an exploit allowing an attacker to get root, they get root in the container, and as far as they can see, the copy of LibreOffice is the only thing on the computer. No browser, no email, no stored passwords, nothing.

All within 1 user account, so that this can be done for multiple users, side-by-side, even concurrently on a multiuser host.

It has nothing to do with user accounts; they are irrelevant to it.
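To make the LibreOffice example concrete: one common trick for running a GUI app from inside a container is to share the host's X server socket with it. A hedged sketch -- the Docker flags are real, the image name is made up:

    import os
    import subprocess

    # Run a (hypothetical) containerised LibreOffice, displaying on the
    # host's X server by sharing its Unix socket directory.
    subprocess.run([
        "docker", "run", "--rm",
        "-e", "DISPLAY=" + os.environ["DISPLAY"],  # where the host's X server is
        "-v", "/tmp/.X11-unix:/tmp/.X11-unix",     # the X server's socket
        "example/libreoffice",                     # made-up image name
    ], check=True)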

GoboLinux's approach to bundling apps mainly just brings benefits to the user: an easier-to-understand filesystem hierarchy, and apps that are self-contained, not spread out all over the filesystem. Nice, but not a killer advantage. There's no big technical advantage, and it breaks lots of things, which is why Gobo needs the gobohide kernel extension and so on. It's also why Gobo has not really caught on.

But now, containers are becoming popular on servers. It's relatively easy to isolate server apps: they have no GUI and often don't interact much with other apps on the server.

Desktop apps are much harder to containerise. However, containerising them brings lots of other advantages -- it could effectively eliminate the differences between Linux distributions, forever ending the APT-vs-RPM wars by making the packaging irrelevant, while delivering much improved security, granularity, simplicity and more.

In theory all Gobo's benefits at the app level (the OS underneath is the same old mess) plus many more.

It looks like something that will happen. It will have some side-effects -- reducing the ease of inter-app communication, for instance. It might break sound mixing, inter-app copy-and-paste, system browser/email/calendar integration and some other things.

And systems will need a lot more hard disk space.

But possibly worth it overall.

One snag at present is that current efforts look to require btrfs, and btrfs is neither mature nor popular at the moment. This might mean that we get new filesystems with the features such sandboxing would need -- maybe there'll be a new ext5 FS, or maybe Bcachefs will fit the bill. It's early days, but the promise looks good.
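For what it's worth, the reason btrfs keeps coming up is copy-on-write subvolumes: snapshotting an app image is nearly free in both time and space, which is just what deploying and rolling back containerised apps needs. A sketch of the primitives -- the paths are illustrative, not from any actual implementation:

    import subprocess

    # Create a subvolume to hold one app's files...
    subprocess.run(["btrfs", "subvolume", "create",
                    "/apps/libreoffice"], check=True)
    # ...unpack the app into it, then take a cheap, read-only
    # copy-on-write snapshot to deploy from:
    subprocess.run(["btrfs", "subvolume", "snapshot", "-r",
                    "/apps/libreoffice", "/apps/libreoffice@4.4"], check=True)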

Sat, Aug. 22nd, 2015, 06:01 pm
Remember, the world’s richest man got that way by selling people stuff they could have had for free.

I was just prodded by someone after I suggested that some friends try Linux: I forgot to mention that you can try it without risking your existing PC setup. It prompted me to write this...

I forget that non-techies don't _know_ stuff like that.

Download a program called VirtualBox. It's free and it lets you run a whole other operating system - e.g. Linux - under Windows as a program. So you can try it out without affecting your real computer.


If all you know is Windows, I'd suggest Linux Mint: http://www.linuxmint.com/

It has a desktop that looks and works similarly to Windows' classic pre-Win8 look & feel.

Google for the steps, but here are the basic instructions (for the more technical, there's a scripted version of the fiddly VM-creation steps after the list):

[1] Download and install VirtualBox

[2] Then download the VirtualBox Extension Pack from the same site. Double-click the file to install it into VBox. (They have to distribute it this way for licensing reasons.)

[3] Download Mint. It comes as an ISO file, an image of a DVD.

[4] Make a new VM in VBox. Give it 2-3 gig of RAM. Enable display 3D acceleration in the settings. (Remember, anything you don't know how to do, Google it.) Leave all the other settings as they are.

[5] Start your new VM. It will ask for an ISO file. Point it at the ISO file of Mint you downloaded.

[6] It will boot and run. Install it onto the virtual hard disk inside Vbox. Just accept all the defaults.

[7] Reboot your new Mint VM.

[8] Install the VBox Guest Additions. On the VBox Devices menu, choose “Insert Guest Additions CD image”. Google for instructions on how to install them.

[9] When it’s finished, reboot the VM.

[10] Update your new copy of Linux Mint. (Remember, Google for instructions.)

That’s it. Play with it. See if you can do the stuff you normally do on Windows all right. If you can’t, Google for what program to use and how to install it. It’s not as quick as a real PC but it works.
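As promised above, for the more technical: steps [4] and [5] can be scripted with VirtualBox's VBoxManage command-line tool. A rough sketch, wrapped in Python; the VM name, memory size, disk size and ISO filename are my examples, not requirements:

    import subprocess

    ISO = "linuxmint.iso"  # the Mint ISO you downloaded in step [3]

    for cmd in [
        # Create and register the VM, give it RAM and 3D acceleration
        ["VBoxManage", "createvm", "--name", "Mint", "--ostype", "Ubuntu_64",
         "--register"],
        ["VBoxManage", "modifyvm", "Mint", "--memory", "3072",
         "--accelerate3d", "on"],
        # Make a ~20 GB virtual disk and attach it, plus the ISO as a DVD
        ["VBoxManage", "createhd", "--filename", "Mint.vdi", "--size", "20000"],
        ["VBoxManage", "storagectl", "Mint", "--name", "SATA", "--add", "sata"],
        ["VBoxManage", "storageattach", "Mint", "--storagectl", "SATA",
         "--port", "0", "--device", "0", "--type", "hdd", "--medium", "Mint.vdi"],
        ["VBoxManage", "storageattach", "Mint", "--storagectl", "SATA",
         "--port", "1", "--device", "0", "--type", "dvddrive", "--medium", ISO],
        # Boot it -- it will start from the ISO
        ["VBoxManage", "startvm", "Mint"],
    ]:
        subprocess.run(cmd, check=True)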

Don’t assume that because you know how to do something on Windows, it works that way on Linux. E.g. you should never download programs from a website and install them into Linux — it has a better way. Be prepared to learn some stuff.

If you can work it, then you can install it on your PC alongside Windows. This is called Dual Booting. It’s quite easy really and then you choose whether you want Windows or Linux when you turn it on.

All my PCs do it, but I use Windows about once or twice a year, when I absolutely need it. Which is almost never. I only use Windows if someone is paying me to — it is a massive pain to maintain and keep running properly compared to more grown-up equivalents. (Linux and Mac OS X are based on a late-1960s project; they are very mature and polished. The first version of the current Windows family is from 1993. It’s still got a lot of growing up to do — it’s only half the age.)

It’s genuinely better. No, you don’t get all the Windows programs. There aren’t many games for it, for instance. But it can do anything Windows can do, and it’s immune to all the Windows viruses and nasties, so you don’t need antivirus or a firewall or anything. That means it’s faster, too — antivirus slows computers down, but you need it on Windows.

All the apps are free. All the updates are free, forever. There are thousands of people on web fora who will help you if you have problems, you just have to ask. It’s educational — you will learn more about computers from learning a different way to use them, but that means you won’t be so helpless. You don’t need to be a white-coated genius scientist, but what it means is you take control back from some faceless corporation. Remember, the world’s richest man got that way by selling people stuff they could have had for free if they just knew how.

Thu, Jun. 11th, 2015, 01:12 pm
The ultimate user interface (Blog post by yours truly)

I have ruffled many feathers with my position that the touch-driven computing sector is growing so fast that it's going to subsume the old WIMP model completely. I don't mean that iPads will replace Windows PCs, but that the descendants of the PC will look and act more like tablets than today's desktops and laptops.

But where is it leading, beyond that point? I have absolutely no concrete idea. But the end point? I've read one brilliant model.

It's in one of the later Foundation books by Isaac Asimov, IIRC. (Not a series I'm that enamoured of, actually.)

A guy gets (steals?) a space yacht: a small, 1-man starship. (Set aside the plausibility of this.)

He searches the ship's crew quarters. In its few luxury rooms, there is no cockpit. No controls, no instruments, nothing. He is bemused.

He returns to the comfiest room, the main stateroom, i.e. cabin/bedroom. In it there is a large, bare dressing table with a comfy seat in front of it. He sits.

Two handprints appear, projected on the surface of the desk, shaped in light.

He studies them. They're just hand-shaped spots of light. He puts his hands on them.

And suddenly, he is much smarter. He knows the ship's position and speed in space. He knows where all the nearby planetary bodies are, their gravity wells, the speeds needed to reach them and enter orbit.

Thinking of the greater galaxy, he knows where all the nearby stars are, their masses, their luminosities, their planetary systems. Merely thinking of a planet, he knows its cities, ports, where to orbit it, etc.

All this knowledge is there in his mind if he wants it; if he allows his attention to move elsewhere, it's gone.

He sits back, shocked. His hands lift from the prints on the desk, and it all disappears.

That is the ultimate UI. One you don't know is there.

Any UI where there are metaphors and abstractions and controls you must operate is inferior; direct interaction is better. We've moved from text views of marked-up files with arcane names in folder hierarchies to today: hi-res, full-colour, moving images of fully-formatted documents and images. That's great.

Some people are happily directly manipulating these — drawing and stroking screens with all their fingers, interacting naturally. Push up to see the bottom of a document, tap on items of interest. It's so natural pre-toddlers can do it.

But many old hands still like their pointing hardware and little icons on screen that they can twiddle with their special pointing devices, and they shout angrily that it's more precise and it's tried and tested and it works.

Show them something better, no, it's a toy. OK for idly surfing the web, or reading, or watching movies, but no substitute for the "real thing".

It's a toy and the mere idea that these early versions could in time grow into something that could replace their 4-box Real Computer of System Unit, Monitor, Mouse and Keyboard is a nonsensical piece of idiocy.

Which is exactly what their former bosses and their tutors said about the Mac's UI 30y ago. It's doubtless what they said about the tinker-toy CP/M boxes a decade before that, and so on.

I'm guilty too. I am using a 25y old keyboard on my tiny silent near-unexpandable 2011 Mac mini, attached via a convertor that cost more than the keyboard and about a third as much as the Mac itself. I don't have a tablet; I don't personally like them much. I like my phablet, though. I gave away my Magic Trackpad - I didn't like it.

(And boy did my friends in the FOSS community curse me out for buying a Mac. I'm a traitor and a coward, apparently.)

But although I personally don't want this stuff, nonetheless, I think it's where we're going.

If adding more layers of abstraction to the system means we can remove layers of abstraction from the human-computer interface, then I'm all for it. The more we can remove, the simpler and easier and clearer the computers we can make, the better. And if we can make them really small and cheap and thus give one to every child in the poorer countries of the world — I'd be delighted.

If the price was putting Microsoft and Apple out of business and destroying the career of everyone working with Windows, and replacing it all with that nasty cancerous GPL and Big-Brother-like services like Google — it would still be worth it.

Tue, Apr. 28th, 2015, 04:07 pm
Why I'm not interested in an all-Apple solution but fancy an all-Ubuntu one

(Repurposed CIX post.)

Don’t get me wrong. I like Apple kit. I am typing right now on an original 1990 Apple Extended II keyboard, attached via an ADB-USB convertor to a Core i5 Mac mini from 2011, running Mac OS X 10.10. It’s a very pleasant computer to work on.

But, to give an example of the issues — I also have an iPhone. It’s my spare smartphone with my old UK SIM in it.

But it’s an iPhone 4. Not a lot of RAM, an underclocked CPU, and of course not upgradable.

So I’ve kept it on iOS 6, because I already find it annoyingly slow and iOS 7 would cause a reported 15-25% or more slowdown. And that’s the latest it will run.

Which means that [a] I can’t use lots of iPhone apps as they no longer support iOS 6.x and [b] it doesn’t do any of the cool integration with my Mac, because my Mac needs a phone running iOS 8 to do clever CTI stuff.

My old 3GS I upgraded from iOS 4 to 5 to 6, and regretted it. It got slower & slower and Apple being Apple, *you can’t go back*.

Apple kit is computers simplified for non-computery people. Stuff you take for granted with COTS PC kit just can’t be done. Not everything — since the G3 era, they take ordinary generic RAM, hard disks, optical drives, etc. Graphics cards etc. can often be made to work; you can, with work, replace CPUs and run OSes too modern to be supported.

But it takes work. If you don’t want that, if you just max out the RAM, put a big disk in and live with it, then it’s fine. I’m old enough that I want a main computer that Just Works and gives me no grief and the Mac is all that and it cost me under £150, used. The OS is of course freeware and so are almost all the apps I run — mostly FOSS.

I like FOSS software. I use Firefox, Adium, Thunderbird, LibreOffice, Calibre, VirtualBox and BOINC. I also have some closed-source freeware like Chrome, Dropbox, TextWrangler and Skype. I don’t use Apple’s browser, email client, chat client, text editor, productivity apps or anything. More or less only iTunes, really.

What this means is that I can use pretty much the same suite of apps on Linux, Mac and Windows, making switching between them seamless and painless. My main phone runs Android, my travelling laptop is a 2nd-hand Thinkpad with the latest Ubuntu LTS on it.

As such, many of the benefits of an all-Apple solution are not available to me — texting and making phone calls from the desktop, seamless handover of file editing from desktop to laptop to tablet, wireless transparent media sync between computers and phone, etc.

I choose not to use any of this stuff because I don’t trust closed file formats and dislike vendor lock-in.

Additionally, I don’t like Apple’s modern keyboards and trackpads, and I like portable devices where I can change the battery or upgrade the storage. So I don’t use Apple laptops and phones and don’t own a tablet. iPads are just big iPhones and I don’t like iPhones much anyway. The apps are too constrained, I hate typing on a touchscreen “keyboard” and I don’t like reading book-length texts from a brightly-glowing screen — I have a large-screen (A4) Kindle for ebooks. (Used off eBay, natch.) TBH I’d quite like a backlight on it but the big-screen model doesn’t offer one.

But I don’t get that with Ubuntu. I never used UbuntuOne; I don’t buy digital content at all, from anyone; my Apple account is around 20 years old and has no payment method set up on it. I have no lock-in to Apple and Ubuntu doesn’t try to foist it on me.

With Ubuntu, *I* choose the laptop and I can (and did) build my own desktops, or more often, use salvaged freebies. My choice of keyboard and mouse, etc. I mean, sure, the Retina iMac is lovely, but it costs more than I’m willing to spend on a computer.

Android is… all right. It’s flakey but it’s cheap, customisable (I’ve replaced web browser, keyboard, launcher and email app, something Apple does not readily permit without drastic limitations) and it works well enough.

But it’s got bloatware, tons of vendor-specific extensions and it’s not quick.

Ubuntu is sleek as Linuxes go. I like the desktop. I turn off the web ads and choose my own default apps and it’s perfectly happy to let me. I can remove the built-in ones if I want and it doesn’t break anything.

If I could get a phone that ran Ubuntu, I’d be very interested. And it might tempt me into buying a tablet.

I’ve tried all the leading Linuxes (and most of the minor ones) and so long as you’re happy with its desktop, Ubuntu is the best by a country mile. It’s the most polished, best-integrated, it works well out of the box. I more or less trust them, as much as I trust any software vendor.

The Ubuntu touch offerings look good — the UI works well, the apps look promising, and they have a very good case for the same apps working well on phone and tablet, and the tablet becoming a usable desktop if you just plug a mouse in.

Here’s a rather nice little 3min demo:

Wireless mouse turned on: desktop mode, windows, title bars, menus, etc.
Turn it off, mid-session: it’s a tablet, with touch controls. *With all the same apps and docs still open.*
Mouse back on: it’s in desktop mode again.

And there’s integration — e.g. phone apps run full-size in a sidebar on a tablet screen, visible side-by-side with tablet apps.

Microsoft doesn’t have this, Apple doesn’t, Google doesn’t.

It looks promising, it runs on COTS hardware and it’s FOSS. What’s not to like?

I suspect, when the whole plan comes together, that they will have a compelling desktop OS, a compelling phone OS and a compelling tablet OS, all working very well together but without any lock-in. That sounds good to me and far preferable to shelling out thousands on new kit to achieve the same on Apple’s platform. Because C21 Apple is all about selling you hardware — new, and regularly replaced, too — and then selling you digital content to consume on it.

Ubuntu isn’t. Ubuntu’s original mission was to bring Linux up to the levels of ease and polish of commercial OSes.

It’s done that.

Sadly, the world failed to beat a path to its door. It’s the leading Linux and it’s expanded the Linux market a little, but Apple beat it to market with a Unix that is easier, prettier and friendlier than Windows — and if you’re willing to pay for it, Apple makes nicer hardware too.

But now we’re hurtling into the post-desktop era. Apple is leading the way; Steve Jobs finally proved his point that he knew how to make a tablet that people wanted and Bill Gates didn’t. Gates’ company still doesn’t, even when it tries to embrace and extend the iPad type of device: millions of the original Surface tablets are destined for landfill like the Atari ET game and the Apple Lisa. (N.B. *not* the totally different Surface Pro, which people use as a lightweight laptop.)

But Apple isn’t trying to make its touch devices replace desktops and laptops — it wants to sell both.

Ubuntu doesn’t sell hardware at all. So it’s trying to drag proper all-FOSS Linux kicking and screaming into the twenty-twenties: touch-driven *and* driveable by desk-bound hardware I/O, equally happy on ARM or x86-64, very shiny but still FOSS underneath.

The other big Linux vendors don’t even understand what it’s trying to do. SUSE does Linux servers for Microsoft shops; Red Hat sells millions of support contracts for VMs in expensive private clouds. Both are happy doing what they’re doing.

Whereas Shuttleworth is spending his millions trying to bring FOSS to the masses.

OK, what Elon Musk is doing is much much cooler, but Shuttleworth’s efforts are not trivial.

Fri, Jan. 30th, 2015, 06:14 pm
Are Macs still better than PCs, or isn't there any real difference any more?

They're a bit better in some ways. It's somewhat marginal now.

OK. Position statement up front.

Anyone who works in computers and only knows one platform is clueless. You need cross-platform knowledge and experience to actually be able to assess strengths, weaknesses, etc.

Most people in IT this century only know Windows and have only known Windows. This means that the majority of the IT trade are, by definition, clueless.

There is little real cross-platform experience any more, because so few platforms are left. Today, it's Windows NT or Unix, running on x86 or ARM. 2 families of OS, 2 families of processor. That is not diversity.

So, only olde phartes, yeah like me, who remember the 1970s and 1980s when diversity in computing meant something, have any really useful insight. But the snag with asking olde phartes is we're jaded & curmudgeonly & hate everything.

So, this being so...

The Mac's OS design is better and cleaner, but that's only to the extent of saying New York City's design is better and cleaner than London's. Neither is good, but one is marginally more logical and systematic than the other.

The desktop is much simpler and cleaner and prettier.

App installation and removal is easier and doesn't involve running untrusted binaries from 3rd parties, which is such a hallmark of Windows that Windows-only types think it is normal and natural and do not see it for the howling screaming horror abomination that it actually is. Indeed, put Windows types in front of Linux and they try to download and run binaries and whinge when it doesn't work. See comment about cluelessness above.

(One of the few places where Linux is genuinely ahead -- far ahead -- today is software installation and removal.)

Mac apps are fewer in number but higher in quality.

The Mac tradition of relative simplicity has been merged with the Unix philosophy of "no news is good news". Macs don't tell you when things work. They only warn you when things don't work. This is a huge conceptual difference from the VMS/Windows philosophy, and so, typically, this goes totally unnoticed by Windows types.

Go from a Mac to Windows and what you see is that Windows is constantly nagging you. Update this. Update that. Ooh you've plugged a device in. Ooh, you removed it. Hey it's back but on a different port, I need a new driver. Oh the network's gone. No hang on it's back. Hey, where's the printer? You have a printer! Did you know you have an HP printer? Would you like to buy HP ink?

Macs don't do this. Occasionally one coughs discreetly and asks if you knew that something bad happened.

PC users are used to it and filter it out.

Also, PC OSes and apps are all licensed and copy-protected. Everything has to be verified and approved. Macs just trust you, mostly.

Both are reliable, mostly. Both just work now, mostly. Both rarely fail, try to recover fairly gracefully and don't throw cryptic blue-screens at you. That difference is gone.

But because of Windows' terrible design and the mistakes that the marketing lizards made the engineers put in, it's howlingly insecure, and vastly prone to malware. This is because it was implemented badly.

Windows apologists -- see cluelessness -- think it's fine and it's just because it dominates the market. This is because they are clueless and don't know how things should be done. Ignore them. They are loud; some will whine about this. They are wrong but not bright enough to know it. Ignore them.

You need antimalware on Windows. You don't on anything else. Antimalware makes computers slower. So, Windows is slower. Take a Windows PC, nuke it, put Linux on it and it feels a bit quicker.

Only a bit 'cos Linux too is a vile mess of 1970s crap. If it still worked, you could put BeOS on it and discover, holy shit wow lookit that, this thing is really fsckin' fast and powerful, but no modern OS lets you feel it. It's buried under 5GB of layered legacy crap.

(Another good example was RISC OS. Today, millions of people are playing with Raspberry Pis, a really crappy underpowered £25 tiny computer that runs Linux very poorly. Raspberry Pis have ARM processors. The ARM processor's original native OS, RISC OS, still exists. Put RISC OS on a Raspberry Pi and suddenly it's a very fast, powerful, responsive computer. Swap the memory card for Linux and it crawls like a one-legged dog again. This is the difference between an efficient OS and an inefficient one. The snag is that RISC OS is horribly obsolete now so it's not much use, but it does demonstrate the efficiency of 1980s OSes compared to 1960s/1970s ones with a few decades of crap layered on top.)

Windows can be sort of all right, if you don't expect much, are savvy, careful and smart, and really need some proprietary apps.

If you just want the Interwebs and a bit of fun, it's a waste of time and effort, but Windows people think that there's nothing else (see clueless) and so it survives.

Meanwhile, people are buying smartphones and Chromebooks, which are good enough if you haven't drunk the Kool-Aid.

But really, they're all a bit shit, it's just that Windows is a bit shittier but 99% of computers run it and 99% of computer fettlers don't know anything else.

Once, before Windows NT, but after Unix killed the Real Computers, Unix was the only real game in town for serious workstation users.

Back then, a smart man wrote:

“I liken starting one’s computing career with Unix, say as an undergraduate, to being born in East Africa. It is intolerably hot, your body is covered with lice and flies, you are malnourished and you suffer from numerous curable diseases. But, as far as young East Africans can tell, this is simply the natural condition and they live within it. By the time they find out differently, it is too late. They already think that the writing of shell scripts is a natural act.” — Ken Pier, Xerox PARC
That was 30y ago. Now, Windows is like that. Unix is the same but you have air-conditioning and some shots and all the Big Macs you can eat.

It's a horrid vile shitty mess, but basically there's no choice any more. You just get to choose the flavour of shit you will roll in. Some stink slightly less.

Fri, Jan. 23rd, 2015, 06:02 pm
The Blackberry Passport: the last real smartphone for grownups? [Rant. Sweary. May amuse.]

I had a brief play with one on my last trip through Stansted Airport back to Czechia. I disliked the feel of the keyboard, then I realised how very fast & accurate it had been on the few test lines that I had typed.

As the only sensible-sized smartphone on the market today with an actual hardware keyboard, I'm very tempted. I'm also kinda fed up with Android.

With a little luck, my Note 2 might still have some resale value, too.

Unfortunately, all the reviews I can find are dreck like this:

BlackBerry Passport review: Getting stuff done or getting in the way?
By Dan Seifert, September 24, 2014

It contains a lot of the typical bollocks that normally makes me denigrate smartphone reviews.

Whinge whinge it's too big whinge no Instagram whinge no Snapchat whinge no $shitty_proprietary_bullshit_toy_chat_app whinge videos don't look nice whinge.

No archiving in Gmail is a slight snag, but unlike Dan, I understand folders and filters and they do 99% of my archiving for me, so I don't care that much.

Well I am not a hormonal teenager who wants to give or get cock-shots. I don't give a flying fuck about Snapchat, Instagram or any of that puerile drivel.

I don't watch videos on my phone, because it's a tool not a toy, but I type on it all the time. I detest virtual keyboards. I'm a middle-aged bloke with proper big man-sized hands; I can use a Galaxy Note 2 one-handed, no problem, and if one of these many little nappy-wearing pseudo-journos with the hands of a 12 year old girl can't grip it, that's a good thing because I can't use tiny crappy toys like normal iPhones. The 6+ is the first ever iPhone that is remotely big enough to be usable to me, and it's too thin and its battery too weedy. I want an inch-thick phone with circa 5 amp-hours in it, like I had 6 or 7y ago, please, not some svelte buttonless hairdressers' phone.

So, not a very helpful review, directly, inasmuch as the man-child who wrote it clearly wants something I'd perceive as a teen's plaything. I am the kind of boring old pharte with a job to do that he tries & utterly fails to imagine being.

But they're all like that, the Passport reviews. They're by bloody children who regard Flappy Bird as a mission-critical app.

But, OTOH, while Mr Still-Spattered-With-Spit-From-School there can't swap images of his small, soft and as-yet hairless genitals with his other playmates on it, he does manage to tell me that it's big, boring, solid and wide. These are good things.

My Note 2 is if anything too small. It doesn't reach from ear to mouth, as a proper phone should, it has no physical buttons, and at 2y old its battery lasts about 4-6h.

(So does its 1y old replacement battery.) But it's too wide, because it's made for watching videos on, and it wastes space on a pointless stylus when really I want it 1cm thicker with a QWERTY keyboard and in an ideal world 2 SIM slots and 2 batteries.

Really, I want a big bricklike Nokia Communicator (or at a push an HTC Universal; mine had an inch-thick 4800 mAh battery, weighed 450g & was the last smartphone I owned with a good battery life)... but with a modern OS.

Sadly, though, all the phone companies are too busy wanking over leaked pictures of Apple products and making shitty compromised me-too toys to produce something for aging adults with dimming eyesight and big hands.

I was just wondering if the last bastion of vaguely sensible boring phones had made something worth buying.

Sun, Nov. 2nd, 2014, 03:52 am
I've finally tried going through the Arch way.

I have been meaning to try Arch Linux for years.

As a former RPM user, once I finally made the switch to Ubuntu, more or less exactly 10y ago, I became so wedded to APT that I have hesitated over non-APT distros ever since.

My spare system on this machine is Crunchbang, which I like a lot, but is a bit too Spartan in its simplicity for me. Crunchbang is based on the stable version of Debian, which gives it one big advantage on my 2007-era built-for-Windows-Vista hardware: it uses a version of X.org so old that the ATI fglrx drivers for my Radeon HD 3470 GPU still work, which they haven't done on Ubuntu for 2 years now.

But there was a spare partition or 2 waiting. I tried Elementary -- very pretty, but the Mac OS X-ness is just skin-deep; it's GNOME 3, very simplified. No ta. Deepin is too slow and doesn't really offer anything I want -- again, it's a modification of GNOME 3, albeit an interesting one. Same goes for Zorin-OS. I've tried Bodhi before -- it's interesting, but not really pretty to my eyes. (Its Enlightenment desktop is all about eye-candy; as a desktop, it's just another Windows Explorer rip-off. If it shipped with a theme that made it look like one of those shiny floaty spinny movie-computer UIs, I might go for it, but it doesn't, it's all lairy glare that only a teenage metalhead could love.) Fedora won't even install; my partitioning is too complex for its installer to understand. SUSE is a bit bloaty for my tastes, and I don't like KDE (or GNOME 3), which also rules out PCLinuxOS and Deepin.

So Arch was the next logical candidate...

I've been a bit sheepish since an Imaginary Internet Friend, Ric Moore, tried it with considerable success a month or two ago. (As I write, he's in hospital having a foot amputated. I've been thinking of him tonight & I hope he's doing well.)

So I have finally done it. Downloaded it, burned it to a CD -- yes, it's that small -- installed it on one of my spare partitions and I am in business.

After a bit of effort and Googling, I found a simple walkthrough, used it, got installed -- and then discovered that the Muktware walkthrough only tells you about KDE, and assumes you'll use that and nothing else. I don't care for KDE in its modern versions, so I went with Xfce.

Getting a DM (display manager) working was non-trivial, but now I have LXDM -- the third one I tried -- and it works. I have an Xfce 4 desktop with the "goodies" extras, Firefox, a working Internet connection via Ethernet, and not much else.
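For anyone following along, the bit the walkthroughs tend to skim over is that on Arch, installing a display manager doesn't activate it: you have to enable its systemd unit yourself. Something like this, assuming the standard lxdm package and unit names, run as root:

    import subprocess

    # Install LXDM, then tell systemd to start it at boot
    subprocess.run(["pacman", "-S", "--noconfirm", "lxdm"], check=True)
    subprocess.run(["systemctl", "enable", "lxdm.service"], check=True)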

It does feel very quick, though, I must give it that. Very snappy. I guess now begins the process of hunting down all the other apps that I use until I've replicated all my basic toolset.

The install was a bit fiddly, much more manual than anything I've done since the mid-1990s, but actually, it all went on very smoothly, considering that it's a lot of hand-entered commands which actually do not seem to depend much on your particular config.

Sun, Jul. 27th, 2014, 07:11 pm
And now for something completely different. [Tech blog post, by me.]

[Recycled (part of) a mailing list post: another crack at trying to explain what was significant about LispMs.]

One of the much-ignored differences between different computer architectures is the machine language, the Instruction Set Architecture (ISA). It's a key difference. And the reason it doesn't get much attention is that these days, there's really only one type left: the C machine.

There used to be quite a diversity -- there were various widely-divergent CISC architectures, multiple RISC ones, Harvard versus von Neumann designs, stack machines versus register machines, and so on.

Most of that has gone now -- either completely disappeared, or shrunk into very specific niches.