When Computers Were Cool
When I was a kid, computers were cool.
Nowadays? Not so much.
I know, I know. Grumpy old gits are a dime a dozen, and their views about the past are often filtered through a golden haze of blissful forgetfulness and nostalgia. This is of course the case for me, too, but I still think the point is valid: computers used to be much cooler - by orders of magnitude - than they are now.
The situation can perhaps be explained by an analogy: You've got a rusty old 1994 FIAT Punto and there is a rather pressing need for you to invest in a new vehicle. Of course, you can dream big - maybe you'd like some kind of armoured personnel carrier, perhaps a souped-up, pimped-out Tesla with all the bells and whistles, or why not a Ferrari? Realistically, though, you're probably considering something along the lines of a brand new Ford: the kind of sensible family car that would still be an upgrade over the withering piece of junk you're currently driving. And, as soon as you've amassed enough funds, that's probably what you're going to buy. That APC, though... Man, how sweet would it be to drive one of those? Just thundering through morning traffic, fist pumping in the air, shouting crude insults at that guy with the SUV who always cuts in front of you at the intersection down by the grocery store. Not so smug now, eh, Mr. Sports Utility?
That's what it used to be like with computers.
Don't get me wrong. You can spend nearly infinite amounts of money on a computer if you'd like to - it's always been like that and probably will be for the foreseeable future. This is especially true for supercomputer clusters used in science and the mainframes keeping the banks and stock markets ticking. But when dreaming about computer power, few of us imagine having access to a behemoth like that. The things we do with our machines can only go faster up to a certain point, unless we dabble in computational biology in our spare time. It would be like replacing our feeble FIAT with an aircraft carrier: sure, it's powerful, but it's not very practical for getting to and from work.
No, the computer we dream about having on our desk is usually something a little bit faster, a little bit sleeker and just a little bit more expensive than what we can actually afford. If you're a dedicated games player, there's always the next graphics card, that extra gig of RAM, those extra few frames per second to chase - but that's still realistic dreaming; it's something within reach. Perhaps not quite in line with a sensible Ford, but not as far out there as an APC.
And, even if you are currently dreaming of the home computer equivalent of an APC (let's pretend that's a top-of-the-line Mac Pro, just for the sake of argument), getting one won't make much of a difference in day-to-day use. Sure, the machine might be faster than your current one, but except for the rare few cases when you actually utilize all that power, it won't provide a profoundly different user experience compared to what an iMac will deliver at a fraction of the cost. It's the same OS, the same applications and the same basic architecture. This is true for Windows and Linux machines as well: you can add more RAM and disk and CPU cores, but the machine won't behave in a significantly different way from what you're used to.
Now, when I was a kid... computers were cool.
In 1994, I bought a second-hand Amiga 1200 with a 120 megabyte hard drive. It was a low-cost, capable home computer with good sound and graphics. It was more than enough for the kind of gaming and school work a kid my age wanted to do, but the 14 MHz processor was a tad slow when it came to heavy lifting. Applying just a hint of Gaussian blur to a very low resolution JPEG file took ages. Dabbling in animation, I frequently hit the barrier of the 2 megs of RAM it came with. However, it also had a motherboard connector for adding more memory and a faster CPU. This was the reasonable dream: it was within my reach, it was the sensible family Ford.
Thus, in 1995, I got a CPU and RAM upgrade for it, making it roughly four times faster. In those days, that was a pretty hefty upgrade: speeding up from 10 to 40 MPH is far more noticeable to a human than the difference between the 10,000 and 40,000 MPH speeds of today's machines. But the speed-up didn't profoundly change my user experience. I was still shuffling about with the same old software and I was still waiting around for that Gaussian blur calculation to finish - although not quite for as many minutes as before.
As I sat there, watching the Gaussian blur progress bar, I of course dreamt of the computing equivalent of an Armoured Personnel Carrier. It wasn't an Amiga 4000 or a Pentium PC or one of those new-fangled PowerMacs. It was something completely different, something that would have utterly changed my all-round user experience, from boot-up to shutdown. I wanted something the likes of which no longer exist: I dreamt of a Unix Workstation.
Not just any old Unix box, mind you, but a rather specific one: a Silicon Graphics Indy with a 24-bit frame buffer, 128 megabytes of memory and a 175 MHz MIPS R4400 CPU.
It was, hands down, just about the most maxed-out piece of hardware that could grace the top of a desk. Design-wise, computers have always been kind of beige - literally and figuratively. Black computers became commonplace some time around, say, 2000, but that was hardly a giant leap in design. Depending on your tastes, you might think companies like Apple or Alienware produce attractive machines - but then again, perhaps you've never seen an SGI Indy.
The teal blue pizza-box case sported a slightly skewed horizontal cut along the middle, offsetting the top and bottom halves from each other. The monitor was huge for its time: a 17" CRT cased in grey granite plastic, matching the mouse and keyboard. Both the computer and the screen were adorned with embossed SGI cube logos in a gleaming silver finish. It was over-the-top, maximalist 1990s more-is-more design in a strangely tasteful package and a far cry from the sleek, subdued designer machines of today. Yet, it wasn't an overstatement - it might've talked the talk, but it sure as hell could walk the walk.
For example, it didn't even have a regular floppy drive - it had a bizarre floptical unit capable of storing 20 megabytes on a single magneto-optical disk. On top of the giant screen, a webcam was poised, making it surely one of the very first computers to come with such a device as standard. In fact, it was called an IndyCam - the term webcam hadn't really caught on yet and was more commonly used to describe just about any camera that regularly uploaded a still image to a public web server.
And those, of course, are all trivial oddities compared to the guts of the machine. Even though it was an entry-level workstation, SGI's custom hardware was capable of churning out both 2D and 3D graphics unmatched by any contemporaneous gaming PC, however expensive, and the CPU could run circles around even the most outrageously priced Pentium home computer available at the time.
And then there was the operating system. And the software.
IRIX, SGI's in-house Unix flavour, came with its own proprietary GUI and desktop. It was called Indigo Magic and, featuring things like scalable vector icons, animated desktop backgrounds and visual feedback cues, it was just about as outlandish as the name suggests. It should be noted that this was ten years before any PC owner had gotten the chance to grow tired of such pointless flash, and that for the serious hacker there was always a terminal emulator with a capable shell on hand.
Apart from all the usual Unix-related niceties such as stability, pre-emptive multitasking, multi-user support, excellent command line utilities and a bunch of readily available programming languages, it had an impressive array of professional productivity software. Most of it was not available on my home computer, and even when it was, the Indy could run it both faster and at higher resolution. Beyond industry standards such as Photoshop and Netscape, there was a world of curious and wonderful applications written with nothing but SGI hardware in mind: web authoring, video editing, image manipulation and graphics creation tools unavailable on any other platform. It was a true digital media production workhorse, an overgrown distant cousin to the machines on the then-budding multimedia PC market.
In short, it was a cool computer. Far too expensive for any home user, of course. But so. Damn. Cool.
There are computers aimed at this kind of work today as well. One of the high-end Macs mentioned before, for example, or perhaps a suitably high-powered Windows machine. In fact, pretty much any dirt-cheap home computer will, pixel for pixel, do what the Indy did - except faster, cheaper and in many cases better.
Yet there isn't, today, an equivalent of the SGI Indy, or the Sun SPARCstation, or the DEC Alpha, or any of the other professional workstations. The only thing that's on offer is more of the same user experience, only slightly faster.
That's why computers are so boring these days. Because even though IRIX, Indigo Magic and the Indy were by no means a perfect solution to all of my computational desires back then, they had the lure of the unknown and unattainable: a goal to strive for, a source of inspiration and aspiration and a promise that better things were possible.
Today, we all know about the quirks, mannerisms, privacy issues, drawbacks and occasional benefits of each option: Windows and its update loops, overpriced Mac hardware and the tiresome fiddliness of Linux. Those are the choices we have and they're not going to change any time soon.
I'd somehow be more okay with all of this if it wasn't for the fact that, to top it all off, we have no other platform left to dream of.
That's just not cool.