Saturday, October 11, 2008

More on Technology History

If Dennis and Ken had had a Selectric instead of a Teletype, we'd be typing "copy" and "remove" instead of "cp" and "rm" today.
- The Unix-Haters Handbook

I've been trying to decide if the evidence really suggests that technological progress is slowing significantly, or if that is a misapprehension based on anecdotal experience.

From the comments in the nostalgia post, I started thinking about the pace of technological change in the last several years, and whether things "take longer" now than they used to.

I think this is somewhat counterintuitive. We're used to thinking, I believe, that the pace of change, and the pace of life in general, is accelerating. But I've started to think that's the opposite of the actual situation.

There's no question that change has not only slowed, but reversed, in aerospace technology. I've written about that quite a bit, and it depresses the hell out of me.

We've been accustomed, however, to continued rapid advancements in information technology, thanks to Moore's Law, but we've now gotten to the point that Moore's Law looks like it has its limits. (About the year 2021, according to researchers at Intel. Notice a trend here?)

Even before we reach the limits of Moore's Law, however, there's Wirth's Law, which says that software gets slower faster than hardware gets faster. That's a real no-sh1tt3r, as we sometimes say. Most commercial software today is awful crap. Did I blog about how it took me nearly a week to get Windows Vista to install and run? Once it finally did install, I let it run (or I should say crash) for about two hours before I wiped the machine and installed Linux.
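Just to make Wirth's Law concrete, here's a toy calculation (the growth rates are made-up numbers for illustration, not measurements): if hardware gets about 41% faster every year but software bloat grows 50% a year, the machine you sit in front of feels a little slower every year.

    # Toy illustration of Wirth's Law. The growth rates below are my own
    # illustrative assumptions, not measured figures.
    hardware_speedup_per_year = 1.41   # roughly doubling every two years
    software_bloat_per_year = 1.50     # assumed: bloat outpaces the hardware

    perceived = 1.0
    for year in range(1, 11):
        perceived *= hardware_speedup_per_year / software_bloat_per_year
        print(f"Year {year:2d}: perceived speed = {perceived:.2f}x")

Run that out ten years and your "perceived speed" is roughly half what you started with, despite the hardware being something like thirty times faster.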

I do think open source software reverses the trend of Wirth's Law, and it's one of the few things that gives me hope for the future. I'm really, really hoping the economy of free actually turns out to be viable.

So I've been thinking about changes in technology, and my use of technology over the years. For those who may have heard these stories before, I apologize.

I started college (in 1983) with a manual typewriter - specifically the Remington Rand portable that my grandmother had taken to college herself in 1927. It was broken, and I had to rig up a series of pulleys and fishing weights to advance the carriage. After the first couple of term papers on that machine, I told my mother that I had to do something different, and she gave me the electric Smith Corona that she had used in college in 1962. That served me through my freshman and sophomore years, until I took a few computer courses and figured out that word processing might be a big advancement.

I had used a Correcting Selectric at work before college and thought it was the most advanced piece of technology I had ever seen, along with the Telecopier, which involved putting a piece of paper on a large metal drum and dialing a phone number, whereupon the large machine at the other end would spin around with a blank page inserted, and eventually a copy of your page would appear. I was first acquainted with this technology around 1980.

Before that (in the 1970s), as I mentioned in the comments to the previous post, I had hung around the computer lab at a local university because I was fascinated by the mainframe (which was probably actually a PDP-11 minicomputer, although I don't fully remember) and all the things it could do. There were huge IBM punchcard machines and large floor-mounted terminals with teletypes for display. The best function of this machine, as far as I could determine, was a purely text-based game called "Star Trek". In this game you would engage the Klingons by guessing their location and vulnerabilities, issuing orders to your crew via the teletype, and exchanging phaser fire until one of you was toast. Afterward you could collect all the green-and-white teletype paper off the floor and review the whole game, or save it for posterity. (Which I did, but apparently posterity ended quite a while back.) This would have been around 1976-1977.
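For anyone who never saw it, the flavor of that game is easy to sketch. Something like this little loop - a minimal reconstruction from memory, not the original source, with all the numbers invented - captures the turn-by-turn feel:

    import random

    # A minimal sketch of a turn-based "Star Trek"-style exchange,
    # reconstructed from memory - not the original game's source code.
    klingon_sector = random.randint(1, 8)   # the Klingon hides in one of 8 sectors
    enterprise, klingon = 100, 100          # hull strength for each ship

    while enterprise > 0 and klingon > 0:
        guess = int(input("Fire phasers at sector (1-8): "))
        if guess == klingon_sector:
            klingon -= random.randint(20, 40)
            print(f"Direct hit! Klingon hull at {klingon}.")
        else:
            print("Phasers miss. The Klingon returns fire!")
            enterprise -= random.randint(10, 30)
            print(f"Enterprise hull at {enterprise}.")

    print("Klingon destroyed!" if klingon <= 0 else "The Enterprise is toast.")

On the real machine, of course, every one of those prompts and reports hammered out on the teletype, which is how the whole game ended up on paper on the floor.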

So when I started taking some computer classes in college, things had advanced somewhat, and a VAX 11/780 was the backbone of the computer science department. Geeks much more credentialed than me told me that the VAX (which ran DEC's VMS operating system) was actually pretty outdated compared to the newest personal computers. One of my wealthier friends had an original IBM PC that he didn't use because he said it was already too slow and outdated (in 1985), so he let me use it for my schoolwork, which was probably the biggest revolution in information technology I've ever experienced in my life.

So I went from a 1927 Remington Rand manual typewriter to the IBM PC in less than three years - that was a lot of change in a short time.

When I graduated, I purchased an IBM PC, with no hard drive and two 360K 5.25 inch floppies, for about $1700 (in 1986 dollars). I ran IBM PC DOS on one floppy, and stored my personal files on the other. For a long time I used a copy of an internal IBM text editor called, of course, "e". Later I graduated to WordStar, and moved on to WordPerfect (still for DOS) by the time I got to grad school. WordPerfect was the first "professional grade" word processor that I had seen, and it had lots of features I never understood.

I was still using pure DOS, although I had obtained a 386 machine with a hard drive, a graphics card, and a color monitor by the time I finished grad school in 1993.

At work (in 1993) we still had DOS-based machines on our desktops, although our office had a couple of high-end Unix workstations (one was a DEC and the other a Sun, I think) that ran X Windows and Motif.

Not long after that, I was doing some pretty bleeding-edge technology work (read about it here - although that account is substantially incorrect) and started working substantially with Windows-based PCs for the first time. It was also about this time that I really started to investigate and use the internet seriously. I was unimpressed with Windows from the outset, but needed it for compatibility with some of the graphics programs we used in remote sensing, and for its network file-sharing functions. At the time the dominant version was Windows for Workgroups 3.11, which included extensions for Windows networking. To get it on the internet, you had to install a third-party IP stack; I believe the one I used was called "Trumpet Winsock".

So in 10 years, I had gone from the 1927 Remington Rand manual typewriter to a laptop (outwardly much like the one I'm typing on now) running a graphical user interface connected to the internet.

Of course, that's just my personal experience, and certainly not the actual timeline of the invention of the technology (the graphical personal computer actually first came along in 1973), but my experience was not untypical, and in many ways I was ahead of the curve.

I had first used networking in 1977, when I plugged the phone handset into one of those rubber-cup acoustic modems to connect to the PDP-11 and play Star Trek, and had joined CompuServe in 1986 because a disk came with the modem I had purchased to connect to the VAX at school. Certainly most people I knew were not using personal computers at all in the 1980s, nor the internet until the very late 1990s.

The point, I think, is that there were big changes in people's use of information technology between 1983 and 1993, and more big changes (mainly related to the internet and networking) between 1993 and 2003 - but the changes have now slowed.

I just mentioned that the laptop I'm using today (a 2007 Dell model) is outwardly identical to the one I used in 1994 (an IBM Thinkpad, which I still have, and which still works). Both use a graphical interface on a high-resolution color LCD monitor, and both connect to the internet at comparatively rapid speeds (broadband wireless today vs. a 14,400 bps modem in 1994 - but most of those 1994 web pages loaded as fast over the modem as 2008 web pages load over the broadband connection).
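The back-of-the-envelope arithmetic behind that observation (all the page sizes and link speeds here are ballpark guesses, not measurements) suggests why: pages grew roughly as fast as the pipes did.

    # Back-of-the-envelope page-load comparison. All page sizes and link
    # speeds here are ballpark assumptions for illustration, not data.
    def transfer_seconds(page_kilobytes, link_kbps):
        # kilobytes -> kilobits, then divide by the link rate
        return page_kilobytes * 8 / link_kbps

    print(f"1994: {transfer_seconds(30, 14.4):.1f} s")    # ~30 KB text page, 14.4 kbps modem
    print(f"2008: {transfer_seconds(1024, 768):.1f} s")   # ~1 MB heavy page, ~768 kbps effective DSL

Both work out to something on the order of ten to twenty seconds - the bandwidth gains got eaten by page bloat.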

The desktop machines I have around the house are outwardly identical to the ones from 1993, just faster with lots more storage and memory. What else has changed?

Ubiquitous broadband, from around 1997, is an advancement, as is wireless networking. I first ordered an ISDN line around 1996, and had an early pre-802.11 wireless network connected to it by 1998. By 2001 I had DSL and 802.11. Those were incremental improvements over wired dialup, however.

Perhaps the biggest recent advancements have been in personal wireless devices, however. I first had a cell phone in my car in 1990, and my first "pocket cell phone" around 1993, I think. Today I have a wireless device that surfs the internet (poorly) and displays GPS position overlaid on Google Earth imagery, as well as placing phone calls (poorly). But it does more than my desktop did in 1998, which is an advancement. I have the sense that personal wireless devices are hitting the wall too, however, although they hit it much more recently. My latest device (a Blackberry) is a much crappier telephone than the plain old cell phone it replaced, and I don't know what else they can add to it beyond the capabilities it already has. (I'm really looking forward to the Google Phone, though.)

So the big trend of more processing power in smaller packages, which, coupled with wireless, offers consumers more and new information power, seems to be continuing at some level, but I'm just not sure for how much longer. The changes just seem to be getting smaller and more incremental. Where are my high-resolution VR goggles? Weren't they supposed to be commonplace by now?

Looking around at other kinds of technology, I just don't see much that's new in a long time. TVs are converging with computer networks, and old-style television will be gone soon. But the technology that's replacing it isn't new; it has just taken a very long time to supplant a technology as deeply embedded in culture as television. And TV content itself has gone completely to hell. We've gone from "Hawaii Five-O" (or better yet, "Hawaiian Eye" and "Adventures in Paradise") to "Survivor: Gambia" and "Dancing with the Stars". The main thing I do with my TV is listen to satellite radio, which I suppose is another recent advancement, although it is an application of much older technologies.

Other major technologies also seem stagnant. Automobiles saw substantial advancements in reliability over the last generation, but haven't gotten significantly more efficient since the 1970s. Hybrid cars are a recent development, but so far they haven't really changed anything. If any areas cry out for technological improvement, it's energy and transportation, but I really get the impression that the automobile and oil industries are actually opposed to real advancement.

I wonder if my suspicion of technological stagnation makes me some kind of Luddite. It isn't that I'm against technological improvement; I just think we're rapidly nearing a period of major and prolonged stagnation, which is likely to have far-reaching and poorly understood implications.

Perhaps we need it, however. Rapid changes of all kinds over the last generation have put great strains on our society, and maybe we need a break. Maybe this stagnation, if it is real, will offer our civilization a chance to "catch its breath" and "recharge its batteries," so it can enter a new period of rapid advancement in the future. I hope so.

1 comment:

MWT said...

Good article.

I think that while the actual technology itself might have stopped advancing quite so fast, we're only beginning to discover all the ways we can use it (and abuse it). I expect we'll be seeing a lot of changes to the way society works, communications-wise and everything directly and indirectly related to the communication changes, in the next decade. So we're still advancing - just in different ways.

On the other hand, if telegraphs suddenly came back into existence, our upcoming generation of people fluent in txt will know exactly what to do. ;)