Since it looks like my FB comment is about to get censored, I thought I'd repost it...
-----Gods, you are such a bunch of newbies! Only one comment out of 20 knows the actual answer.
History lesson. Sit down and shaddup, ya dumb punks.
Early microcomputers did not have a single PCB with all the components on it. They were on separate cards, all connected together via a bus on a board called a backplane. There were 2 types, active and passive, but either way the backplane did nothing except interconnect the other components.
Then, with increasing integration, a main board with the main controller logic on it became common, but this had slots on it for other components that were too expensive to include. The pioneer was the Apple II, known affectionately as the Apple ][. The main board had the processor, RAM and glue logic. Cards provided facilities such as printer ports, an 80 column display, a disk controller and so on.
But unlike the older S100 bus and similar machines, these boards did nothing without the main board. So they were called daughter boards, and the one they plugged into was the motherboard.
Then came the Mac. This had no slots so there could be no daughterboards. Nothing plugged into it, not even RAM -- it accepted no expansions at all; therefore it made no sense to call it a motherboard.
It was not the only PCB in the computer, though. The original Mac, remember, had a 9" mono CRT built in. An analogue display, it needed analogue electronics to control it. These were on the Analog Board (because Americans can't spell.)
The board with the digital electronics on it -- the bits that did the computing, in other words the logic -- was the Logic Board.
2 main boards, not one. But neither was primary, neither had other subboards. So, logic board and analog board.
And it's stuck. There are no expansion slots on any modern Mac. They're all logic boards, *not* motherboards because they have no children.
A friend of mine who is a Commodore enthusiast commented that if the company had handled it better, the Amiga would have killed the Apple Mac off. But I wonder. I mean, the $10K Lisa ('83) and the $2.5K Mac ('84) may only have been a year or two before the $1.3K Amiga 1000 ('85), but in those years, chip prices were plummeting -- maybe rapidly enough to account for the discrepancy.
The 256kB Amiga 1000 was half the price of the original 128kB Mac a year earlier.
Could Tramiel's Commodore have sold Macs at a profit for much less? I'm not sure. Later, yes, but then, Mac prices fell, and anyway, Apple has long been a premium-products-only sort of company. But the R&D process behind the Lisa & the Mac was long, complex & expensive. (Yes, true, it was behind the Amiga chipset, too, but less so on the OS -- the original CAOS got axed, remember. The TRIPOS thing was a last-minute stand-in, as was Arthur/RISC OS on the Acorn Archimedes.)
The existence of the Amiga also pushed development of the Mac II, the first colour model. (Although I think it probably more directly prompted the Apple ][GS.)
It's much easier to copy something that someone else has already done. Without the precedent of the Lisa, the Mac would have been a much more limited 8-bit machine with a 6809. Without the precedent of the Mac, the Amiga would have been a games console.
I think the contrast between the Atari ST and the Sinclair QL, in terms of business decisions, product focus and so on, is more instructive.
The QL could have been one of the important 2nd-generation home computers. It was launched a couple of weeks before the Mac.
But Sinclair went too far with its hallmark cost-cutting on the project, and the launch date was too ambitious. The result was a 16-bit machine that was barely more capable than an 8-bit one from the previous generation. Most of the later 8-bit machines had better graphics and sound; some (Memotech, Elan Enterprise) as much RAM, and some (e.g. the SAM Coupé) also supported built-in mass storage.
But Sinclair's OS, QDOS, was impressive. An excellent BASIC, front & centre like an 8-bit machine, but also full multitasking, modularity so it readily handled new peripherals -- but no GUI by default.
The Mac, similarly RAM deprived and with even poorer graphics, blew it away. Also, with the Lisa and the Mac, Apple had spotted that the future lay in GUIs, which Sinclair had missed -- the QL didn't get its "pointer environment" until later, and when it did, it was primitive-looking. Even the modern version still is.
Atari, entering the game a year or so later, had a much better idea where to spend the money. The ST was an excellent demonstration of cost-cutting. Unlike the bespoke custom chipsets of the Mac and the Amiga, or Sinclair's manic focus on cheapness, Atari took off-the-shelf hardware and off-the-shelf software and assembled something that was good enough. A decent GUI, an OS that worked well in 512kB, graphics and sound that were good enough. Marginally faster CPU than an Amiga, and a floppy format interchangeable with PCs.
Yes, the Amiga was a better machine in almost every way, but the ST was good enough, and at first, significantly cheaper. Commodore had to cost-trim the Amiga to match, and the first result, the Amiga 500, was a good games machine but too compromised for much else.
The QL was built down to a price, and suffered for it. Later replacement motherboards and third-party clones such as the Thor fixed much of this, but it was no match for the GUI-based machines.
The Mac was in some ways a sort of cut-down Lisa, trying to get that ten-thousand-dollar machine down to a more affordable quarter of the price. Sadly, this meant losing the hard disk and the innovative multitasking OS, which were added back later in compromised form -- the latter cursed the classic MacOS until it was replaced with Mac OS X at the turn of the century.
The Amiga was a no-compromise games machine, later cleverly shoehorned into the role of a very capable multimedia GUI computer.
The ST was also built down to a price, but learned from the lessons of the Mac. Its spec wasn't as good as the Amiga's, its OS wasn't as elegant as the Mac's, but it was good enough.
The result was that games developers aimed at both, limiting the quality of Amiga games to the capabilities of the ST. The Amiga wasn't differentiated enough -- yes, Commodore did high-end three-box versions, but the basic machines remained too low-spec. The third-generation Amiga 1200 had a faster 68020 chip which the OS didn't really utilise, and its provision for a built-in hard disk was only an optional extra. AmigaOS was a pain to use with only floppies, like the Mac -- whereas the ST's ROM-based OS was fairly usable with a single drive. A dual-floppy-drive Amiga was the minimum usable spec, really, and it benefited hugely from a hard disk -- but Commodore didn't fit one.
The ST killed the Amiga, in effect. By providing an experience that was nearly as good in the important, visible ways, it forced Commodore to price-cut the Amiga to keep it competitive, hobbling the lower-end models. And as games were written to be portable between them both without too much work, they mostly didn't exploit the Amiga's superior abilities.
Acorn went its own way with the Archimedes -- it shared almost no apps or games with the mainstream machines, and while its OS is still around, it hasn't kept up with the times and is mainly a curiosity. Acorn kept its machines a bit higher-end, having affordable three-box models with hard disks right from the start, and focused on the educational niche where it was strong.
But Acorn's decision to go its own way was entirely vindicated -- its ARM chip is now the world's best-selling CPU. Both Microsoft and Apple OSes run on ARMs now. In a way, it won.
The poor Sinclair QL, of course, failed in the market and Amstrad killed it off when it was still young. But even so, it inspired a whole line of successors -- the CST Thor, the ICL One-Per-Desk (AKA Merlin Tonto, AKA Telecom Australia ComputerPhone), the Qubbesoft Aurora replacement main board and later the Q40 and Q60 QL-compatible PC-style motherboards. It had the first ever multitasking OS for a home computer, QDOS, which evolved into SMSQ/e and moved over to the ST platform instead. It's now open source, too.
And Linus Torvalds owned a QL, giving him a taste for multitasking so that he wrote his own multitasking OS when he got a PC. That, of course, was Linux.
The Amiga OS is still limping along, now running on a CPU line -- PowerPC -- that is also all but dead. The open-source version, AROS, is working on an ARM port, which might make it slightly more relevant, but it's hard to see a future or purpose for the two PowerPC versions, MorphOS and AmigaOS 4.
The ST OS also evolved, into a rich multitasking app environment for PCs and Macs (MagiC) and into a rich multitasking FOSS version, AFROS, running on an emulator on the PC, Aranym. A great and very clever little project, but it went nowhere, as did PC GEM, sadly.
All of these clever OSes -- AROS, AFROS, QDOS AKA SMSQ/E -- went FOSS too late and are forgotten. Me, I'd love Raspberry Pi versions of any and all of them to play with!
In its final death throes, a flailing Atari even embraced the Transputer. The Atari ABAQ could run Perihelion's HeliOS, another interesting long-dead OS. Acorn's machines ran one of the most amazing OSes I've ever seen, TAOS, which nearly became the next-generation Amiga OS. That could have shaken up the industry -- it was truly radical.
And in a funny little side-note, the next next-gen Amiga OS after TAOS was to be QNX. It didn't happen, but QNX added a GUI and rich multimedia support to its embedded microkernel OS for the deal. That OS is now what powers my Blackberry Passport smartphone. Blackberry 10 is now all but dead -- Blackberry has conceded the inevitable and gone Android -- but BB10 is a beautiful piece of work, way better than its rivals.
But all the successful machines that sold well? The ST and Amiga lines are effectively dead. The Motorola 68K processor line they used is all but dead, too. So is its successor, PowerPC.
So it's the two niche machines that left the real legacy. In a way, Sinclair Research did have the right idea after all -- but prematurely. It thought that the justification for 16-bit home/business computers was multitasking. In the end, it was, but only in the later 32-bit era: the defining characteristic of the 16-bit era was bringing the GUI to the masses. True robust multitasking for all followed later. Sinclair picked the wrong feature to emphasise -- even though the QL post-dated the Apple Lisa, so the writing was on the wall for all to see.
But in the end, the QL inspired Linux and the Archimedes gave us the ARM chip, the most successful RISC chip ever and the one that could still conceivably drive the last great CISC architecture, x86, into extinction.
Funny how things turn out.
So a regular long-term member of one of the Ubuntu lists is saying that they don't trust Google to respect their privacy. This from someone who runs Opera 12 (on Ubuntu with Unity) because they had not noticed it had been updated... for three years.
I realise that I could have put this better, but...
As is my wont, I offered one of my favourite quotes:
Scott McNealy, CEO and co-founder of Sun Microsystems, said it best.
He was put on a panel on internet security and privacy, about 20y ago.
Eventually, they asked the silent McNealy to say something. His reply: "There is no privacy on the Internet. Get over it."
He was right then and he's right now. It's a public place. It's what it's for. Communication, sharing. Deal with it.
Run current software, follow best-practice guidelines from the likes of SwiftOnSecurity on Twitter, but don't be obsessive about it, because that is totally pointless.
You CANNOT keep everything you do private and secure and also use the 21st century's greatest communications tool.
So you choose. Use the Internet, and stop panicking, or get off it and stay off it.
Modern OSes and apps do "phone home" about what you're doing, yes, sure.
This does not make them spyware.
http://www.zdnet.com/article/revealed-the-crucial-detail-that-windows-10-privacy-critics-are-missing/?tag=nl.e539&s_cid=e539&ttag=e539&ftag=TRE17cfd61
You want better software? You want things that are more reliable, more helpful, more informative?
Then stop complaining and get on with life.
No? You want something secure, private, that you can trust, that you know will not report anything to anyone?
Then go flash some open-source firmware onto an old Thinkpad and run OpenBSD.
There are ways of doing this, but they are hard, they are a lot
more work, and you will have a significantly degraded experience with a lot of very handy facilities lost.
That is the price of privacy.
And, listen, I am sorry if this is not what you want to hear, but if you are not technically literate enough to notice that you're running a browser that has been out of date for 3 years, then I think that you are not currently capable of running a really secure environment. I am not being gratuitously rude here! I am merely pointing out facts that others will be too nervous to.
You cannot run a mass-market OS like Windows 10, Mac OS X or Ubuntu with Unity and have a totally secure private computer.
You can't. End of. It's over. These are not privacy-oriented platforms.
They do exist. Look at OpenBSD. Look at Qubes OS.
But they are hard work and need immense technical skill -- more than I have, for instance, after doing this stuff for a living for nearly 30y. And even then, you get a much poorer experience, like a faster 1980s computer or something.
As it is, after being on my CIX address for 25 years and my Gmail address for 12, all my email goes through Gmail now -- the old address, the Hotmail and Yahoo spamtraps, all of them. I get all my email, contacts and diary, all in one place, on my Mac and on both my Linux laptops and on both my Android and Blackberry smartphones. It's wonderful. Convenient, friendly, powerful, free, cross-platform and based on FOSS and compatible with FOSS tools.
But it means I must trust Google to store everything.
I am willing to pay that price, for such powerful tools for no money.
I am a trained Microsoft Exchange admin. I could do similar with Office 365, but I've used it, and it's less cross-platform, it's less reliable, it's slower, the native client tools are vastly inferior and it costs money.
Nothing much else could do this unless I hosted my own, which I am technically competent to do but would involve a huge amount of work, spending money and
still trusting my hosting provider.
You have a simple choice: power, convenience and ease on one side; or privacy on the other, at the cost of learning a lot more tech skills, plus inconvenience and a loss of flexibility, capability and simplicity.
You run a closed-source commercial browser on what [another poster] correctly points out is the least-private Linux distro that there is.
You have already made the choice.
So please, stop complaining about it. You chose. You are free to change your mind, but if you do, off to OpenBSD you go. Better start learning shell script and building from source.
Both are niche today. Conceded, yes, but… and it’s a big “but”…
It depends on 2 things: how you look at it, & possible changes in circumstances.
Linux *on the desktop* is niche, sure. But that’s because of the kind of desktop/laptop usage roles techies see.
In other niches:
http://www.cnbc.com/2015/12/03/googles-chromebooks-make-up-half-of-us-classroom-devices.html
The URL explains the main story: 51% of American classroom computers are Chromebooks now. That’s a lot, and that’s 100% Linux.
And it’s happened quite quickly (in under 3y), without noise or fuss, without anyone paying a lot of attention. That’s how these changes often happen: under the radar, unnoticeably until suddenly you wake up & it’s all different.
In servers, it utterly dominates. On pocket smart devices, it utterly dominates.
But look at conventional adults’ desktops and laptops, no, it’s nowhere, it’s niche.
So, for now, on the road as private vehicles, e-cars are a small niche, yes.
But, in some role we’re not thinking about — public transport, or taxis, or something other than private cars — they might quietly gain the edge and take over without us noticing, as Chromebooks are doing in some niches.
The result, of course, is that they’re suddenly “legitimised” — there’s widespread knowledge, support, tooling, whatever and suddenly changes in some other niche mean that they’re a lot more viable for private cars.
For years, I ran the fastest computer I could afford. Often that was for very little money, because in the UK I was poor for a long time. I built and fixed and bodged. My last box was a honking big quad-core with 8GB of RAM (from Freecycle) with a dual-head 3D card (a friend’s cast-off) and lots of extras.
Then I sold, gave or threw away or boxed up most of my stuff, came over here, and had money but less space and less need to bodge. So I bought a friend’s old Mac mini. I’m typing on it now, on a 25y old Apple keyboard via a converter.
It’s tiny, silent except when running video or doing SETI, and being a Mac takes no setup or maintenance. So much less work than my Hackintosh was.
Things change, and suddenly an inconceivable solution is the sensible or obvious one. I don’t game much — very occasional bit of Portal - so I don’t need a GPU. I don’t need massive speed so a Core i5 is plenty. I don’t need removable media any more, or upgradability, or expandability.
Currently, people buy cars like my monster Hackintosh: used, cheap, but big, spacious, powerful, with lots of space in ‘em, equally capable of going to the shops or taking them to the other end of the country — or a few countries away. Why? Well that’s because most cars are just like that. It’s normal. It doesn’t cost anything significant.
But in PCs, that’s going away. People seem to like laptops and NUCs and net-tops and Chromebooks and so on: tiny, no expansion slots, often no optical media, not even ExpressCard slots or the like any more — which were standard a decade or 2 ago. With fast external serial buses, we don’t need them any more.
Big bulky PCs are being replaced by small, quiet, almost-unexpandable ones. Apple is as ever ahead of the trade: it doesn’t offer any machines with expansion slots at all any more. You get notebooks, iMacs, Mac minis or the slotless built-around-its-cooling Mac Pro, incapable of even housing a spinning hard disk.
Why? When they’re this bloody fast anyway, only hobbyist dabblers change CPUs or GPUs. Everyone else uses it till it dies, then replaces it.
Cars may well follow. Most only do urban cycle motoring: work, shops, occasional trip to the seaside or something. Contemporary electric cars do that fine and they’re vastly cheaper to run. And many don’t need ‘em daily so use car clubs such as Zipcar etc.
Perhaps the occasional longer trips will be taken up by some kind of cheap rentals, or pooling, or something unforeseen.
But it’s a profound error of thinking to write them off as being not ready yet, or lacking infrastructure, or not viable. They are, right now, and they are creeping in.
We are not so very far from the decline and fall of Windows and the PC. It might not happen, but with Mac OS X and Chromebooks and smarter tablets and convertibles and so on, the enemies are closing in. Not at the gate yet, but camped all around.
Electric vehicles aren’t quite there yet but they’re closer than the comments in this thread — entirely typically for the CIX community — seem to think.
(The title is a parody of http://www.dreamsongs.com/WIB.html .)
Even today, people still rail against the horrors of BASIC, as per Edsger Dijkstra's famous comment about it brain-damaging beginner programmers beyond any hope of redemption:
https://reprog.wordpress.com/2010/03/09/where-dijkstra-went-wrong-the-value-of-basic-as-a-first-programming-language/
I rather feel that this is due to perceptions of some of the really crap early 8-bit BASICs, and wouldn't have applied if students learned, say, BBC BASIC or one of the other better dialects.
For example, Commodore's pathetically-limited BASIC as supplied on the most successful home computer ever, the Commodore 64, in 1982. Despite its horrors, it's remembered fondly by many. There's even a modern FOSS re-implementation of it!
https://github.com/mist64/cbmbasic
I've long been puzzled as to exactly why the Commodore 64 shipped with such a terrible, limited, primitive BASIC in its ROM: CBM BASIC 2.0, essentially the 6502 version of Microsoft's MS-BASIC. It wasn't done for space reasons -- the original Microsoft BASIC fitted into 4kB of ROM and a later version into 8kB:
http://www.emsps.com/oldtools/msbasv.htm
Acorn's BBC BASIC (first released a year earlier, in 1981) was a vastly better dialect.
AFAIK all the ROMable versions of BBC BASIC (BASIC I to BASIC 4.62) fitted into a 16kB ROM, so in terms of space, it was doable.
http://mdfs.net/Software/BBCBasic/Versions
IOW, CBM had enough room; the C64 kernal+BASIC were essentially those of the original PET, and fitted into an 8kB ROM, I think. And the C64 shipped after
the B and P series machines, the CBM-II. OK, CBM BASIC 4 wasn’t much
of an improvement, but it was better.
Looking back years later, and reading stuff like Cameron Kaiser’s “Secret Weapons of Commodore” site:
http://www.floodgap.com/retrobits/ckb/secret/
… it seems to me that Commodore management never really had much of an idea of what they were doing. Unlike companies such as Sinclair or Acorn, labouring for years over tiny numbers of finely-honed models, in the 8-bit era, Commodore had multiple teams designing dozens of models of all sorts of kit, often conflicting with one another, and just occasionally chose to ship certain products and kill others — sometimes early, sometimes when it was nearly ready and the packaging was being designed.
(Apple was similar, but at a smaller scale — e.g. the Apple /// competing with the later Apple ][ machines, and the Mac competing with the Lisa, and then the Apple ][GS competing with the Mac.)
There were lovely devices that might have thrived, such as the C65, which were killed.
There were weird, mostly inexplicable hacked-together things, such as the C128: a bastard of a C64, plus a slightly-upgraded C64, plus, of all things, a CP/M micro based around an entirely different and totally incompatible processor, so the C128 had two: a 6502 derivative and a Z80. Bizarre.
There were determined efforts to enhance product lines whose times were past, such as the CBM-II machines, an enhanced PET when the IBM PC was already taking over.
There were odd half-assed efforts to fix problems with released products, such as the C16 and Plus-4, which clearly showed that management didn’t understand their own successes: the C64 was a wildly-successful upgrade of the popular VIC-20, but rather than learn from that and do it again, Commodore did something totally different and incompatible, launched with some fanfare, and appeared mystified that it bombed.
It’s a very strange story of a very schizophrenic company.
And of course, rather than develop their own successor for the 16-bit era, they bought it in — the Lorraine, later the Amiga, a spiritual successor to the Atari 8-bit machines, which themselves were inspired kit for their time.
This left Atari in the lurch, but the company responded in an inspired way with the ST: a clever mixture of off-the-shelf parts -- PC-type where that was good enough (e.g. the graphics controller), or from the previous generation of 8-bits (e.g. the sound chip) -- plus a bought-in adapted OS (Digital Research's GEMDOS plus GEM, never crippled like the PC version was due to Apple's lawsuit), which also meant PC disk formats and file compatibility. And of course the brilliant inclusion of MIDI ports, foreseeing an entire industry that was just around the corner.
The ST is what the Sinclair QL should have been: a cheap, affordable, usable 16-bit computer. Whereas the poor doomed QL was Sinclair doing its trademark thing too far: a 16-bit machine cut down to the point that it was no better than a decent 8-bit machine.
Whereas now, almost all the diversity is gone. Today, we just have generic x86 boxes and occasional weird little ARM things, and apart from some research or hobbyist toys, just 2 OS families -- Windows NT or some flavour of Unix.
There are moves afoot to implement desktop apps inside containers on Linux -- e.g.
https://wiki.gnome.org/Projects/SandboxedApps/Sandbox
This is connected with the current uptake of Docker. There seems to be a lot of misunderstanding about Docker, exemplified by a mailing list post I just read which proposes running different apps in different user accounts instead and accessing them via VNC. This is an adaptation of my reply.
Docker is a kind of standardized container for Linux.
Containers are a sort of virtual machine.
Current VMs are PC emulators that run on the PC: they virtualise the PC's hardware, so you can run multiple OSes at once on one machine.
This is useful if you want to run, say, 3 different Linux distros, Windows and Solaris on the same machine at once.
If you run lots of copies of the same OS, it is very inefficient, as you duplicate lots of code.
Containers virtualise the OS instead of the computer. 1 OS instance, 1 kernel, but to the apps running on that OS, each app has its own OS. Apps cannot see other apps at all. The virtualisation means that each app thinks it is running standalone on the OS, with nothing else installed.
This means that you can, say, run 200 instances of Apache on 1 instance of Linux, and they are all isolated. If one crashes, the others don't. You can mix versions, have custom modules in one that the others don't have, etc.
All without the overhead of running 200 copies of the OS.
Containerising apps is a security measure. It means that if, say, you have a compromised version of LibreOffice that contains an exploit allowing an attacker to get root, they get root in the container, and as far as they can see, the copy of LibreOffice is the only thing on the computer. No browser, no email, no stored passwords, nothing.
All within 1 user account, so that this can be done for multiple users, side-by-side, even concurrently on a multiuser host.
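As a concrete sketch of the server-side idea (this assumes Docker is installed and its daemon is running; the container names and port numbers here are invented for illustration):

```shell
# Run two isolated Apache httpd instances on one kernel, one OS install.
# Each container believes it is the only thing on the machine.
docker run -d --name web-a -p 8081:80 httpd:2.4
docker run -d --name web-b -p 8082:80 httpd:2.4

# Mix versions side by side with no conflicts:
docker run -d --name web-old -p 8083:80 httpd:2.2

# Stop one; the others carry on untouched.
docker stop web-a
docker ps
```

Each container gets its own filesystem view from the httpd image; none of them can see the host's installed software, or each other.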
It is nothing to do with user accounts; these are irrelevant to it.
Gobo's approach to bundling apps mainly just brings benefits to the user: an easier-to-understand filesystem hierarchy, and apps that are self-contained, not spread out all over the filesystem. Nice, but not a killer advantage. There's no big technical advantage, and it breaks lots of things, which is why Gobo needs the gobohide kernel extension and so on. It's also why Gobo has not really caught on.
But now, containers are becoming popular on servers. It's relatively easy to isolate server apps: they have no GUI and often don't interact much with other apps on the server.
Desktop apps are much harder to containerise. However, containerising them brings lots of other advantages -- it could effectively eliminate the differences between Linux distributions, forever ending the APT-vs-RPM wars by making the packaging irrelevant, while delivering much improved security, granularity, simplicity and more.
In theory, that's all of Gobo's benefits at the app level (the OS underneath is the same old mess) plus many more.
It looks like it might be something that will happen. It will have some side-effects -- reducing the ease of interapp communication, for instance. It might break sound mixing, or inter-app copy-and-paste, system browser/email/calendar integration and some other things.
And systems will need a lot more hard disk space.
But possibly worth it overall.
One snag at present is that current efforts look to require btrfs, and btrfs is neither mature nor popular at the moment. This might mean that we get new filesystems with the features such sandboxing would need -- maybe there'll be a new ext5 FS, or maybe Bcachefs will fit the bill. It's early days, but the promise looks good.
I was just prodded by someone when I suggested that some friends try Linux. I forgot to mention that you can try it without risking your existing PC setup. It prompted me to write this...
I forget that non-techies don't _know_ stuff like that.
Download a program called VirtualBox. It's free and it lets you run a whole other operating system - e.g. Linux - under Windows as a program. So you can try it out without affecting your real computer.
https://www.virtualbox.org/
If all you know is Windows, I'd suggest Linux Mint: http://www.linuxmint.com/
It has a desktop that looks and works similarly to Windows' classic pre-Win8 look & feel.
Google for the steps, but here are the basic instructions:
 Download and install VirtualBox
 Then download the VirtualBox Extension Pack from the same site. Double-click the downloaded file to install it into VBox. (It’s distributed separately for licensing reasons.)
 Download Mint. It comes as an ISO file, an image of a DVD.
 Make a new VM in VBox. Give it 2-3 gig of RAM. Enable display 3D acceleration in the settings. (Remember, anything you don't know how to do, Google it.) Leave all the other settings as they are.
 Start your new VM. It will ask for an ISO file. Point it at the ISO file of Mint you downloaded.
 It will boot and run. Install it onto the virtual hard disk inside Vbox. Just accept all the defaults.
 Reboot your new Mint VM.
 Install the VBox Guest Additions. On the VBox Devices menu, choose “Insert Guest Additions CD image”. Google for instructions on how to install them.
 When it’s finished, reboot the VM.
 Update your new copy of Linux Mint. (Remember, Google for instructions.)
That’s it. Play with it. See if you can do the stuff you normally do on Windows all right. If you can’t, Google for what program to use and how to install it. It’s not as quick as a real PC but it works.
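For the terminal-inclined, the steps above can also be scripted with VirtualBox’s VBoxManage command-line tool. This is a sketch only; the VM name, disk size and ISO filename are placeholders:

```shell
# Create and register a 64-bit Ubuntu-family VM (Mint is Ubuntu-based)
VBoxManage createvm --name "Mint" --ostype Ubuntu_64 --register

# 3 GB RAM, 128 MB video RAM, 3D acceleration on
VBoxManage modifyvm "Mint" --memory 3072 --vram 128 --accelerate3d on

# A 20 GB virtual disk, plus a SATA controller to attach things to
VBoxManage createmedium disk --filename Mint.vdi --size 20480
VBoxManage storagectl "Mint" --name "SATA" --add sata

# Attach the disk, and the downloaded Mint ISO as a virtual DVD
VBoxManage storageattach "Mint" --storagectl "SATA" --port 0 --device 0 --type hdd --medium Mint.vdi
VBoxManage storageattach "Mint" --storagectl "SATA" --port 1 --device 0 --type dvddrive --medium linuxmint.iso

# Boot the VM and install Mint as per the steps above
VBoxManage startvm "Mint"
```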
Don’t assume that because you know how to do something on Windows, it works that way on Linux. E.g. you never should download programs from a website and install them into Linux — it has a better way. Be prepared to learn some stuff.
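That “better way” is the package manager. On Mint, for example, installing the VLC media player from the command line looks like this (the graphical Software Manager does the same job; VLC is just an example package):

```shell
# Refresh the package lists from Mint's repositories
sudo apt update

# Install VLC from those repositories; no website downloads involved
sudo apt install vlc

# One command later brings updates for everything installed this way
sudo apt upgrade
```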
If you can work it, then you can install it on your PC alongside Windows. This is called Dual Booting. It’s quite easy really and then you choose whether you want Windows or Linux when you turn it on.
All my PCs do it, but I use Windows about once or twice a year, when I absolutely need it. Which is almost never. I only use Windows if someone is paying me to — it is a massive pain to maintain and keep running properly compared to more grown-up equivalents. (Linux and Mac OS X are based on a late-1960s project; they are very mature and polished. The first version of the current Windows family is from 1993. It’s still got a lot of growing up to do — it’s only half the age.)
It’s genuinely better. No, you don’t get all the Windows programs. There aren’t many games for it, for instance. But it can do anything Windows can do, it’s faster, it’s immune to all the Windows viruses and nasties so you don’t need antivirus or a firewall or anything. That means it’s faster, too — antivirus slows computers down, but you need it on Windows.
All the apps are free. All the updates are free, forever. There are thousands of people on web fora who will help you if you have problems, you just have to ask. It’s educational — you will learn more about computers from learning a different way to use them, but that means you won’t be so helpless. You don’t need to be a white-coated genius scientist, but what it means is you take control back from some faceless corporation. Remember, the world’s richest man got that way by selling people stuff they could have had for free if they just knew how.