Bugs!
Have you ever written on why bug fixes are so prevalent in the software industry? I assume they either did not exist at all, or were not so prevalent, before the advent of the internet. [...] I know console gaming has avoided this thus far, preferring to delay and test rigorously over a long period until a game is as near flawless as a computer game is after the last patch, once the company has moved on. Although I suspect that as soon as some major game comes out with a major unforeseen flaw, Microsoft et al. will allow them to use Xbox Live to correct it, opening the floodgates.
As a libertarian I am curious if some law(s) is at the root of it. Or is it just a culture thing?
I assume by now you're all familiar with the story of The Very First Computer Bug: an unfortunate moth that met its maker while caught in the relays of the Harvard Mark II. It's a great creation myth, but it's just not true. Aircraft mechanics and radar engineers referred to bugs in their equipment during World War II, mechanical engineers talked about bugs in systems in the 1920s and 1930s, and even Thomas Edison wrote about getting the bugs out of a new invention back in the 1870s. I suspect the term has its true origins back in the early industrial age, and originally had some agricultural or wool-processing connotation, but whatever the real story is, we'll never know now.
As to why bugs seem to be increasing lately: I think it's really more a matter of perception, caused by a combination of growing exposure and increased complexity. I know for a fact that back in the days of punchcards and COBOL, programs had bugs — sometimes stunningly stupid and staggeringly show-stopping bugs — but since most programs back then were pretty close to hand-rolled, one-of-a-kind customized applications, few people outside of the accounting and data processing departments knew about them. I also know that back in the days of teletypes and flowcharting templates, programs still had bugs, because I wrote quite a few myself. But in those ancient days it was still possible for a programmer to sit down with a printout of the source code, a pencil, and a can of Dr. Pepper, and work through his program line by line until he reached that "Eureka!" moment. (Or, more likely, that "Omigod, I can't believe I did something that stupid" moment.)
I once worked with a guy whose mantra was, "If you can't fit your program into four thousand bytes, you don't know what you're trying to do." He was serious, too. I wonder how he's adapted to today's million-line programs? I suspect he's retired.
The great transition happened in the early 1980s, with the microcomputer revolution. In the span of less than five years computers went from being those big mysterious things in the data center that only the Lords of Cobol were allowed to approach, to being the noisy things on everybody's desk that still didn't work quite as well as their old Selectrics and calculators, but what the heck, the boss said they had to use them anyway. (You'd be amazed at how many people back in those days kept a typewriter and a printing calculator stashed away somewhere, for when they really had to get work done in a hurry.) Concurrently, software went from being something that the resident programmer/analysts had cobbled together, and that they'd fix overnight if you found something wrong, asked them nicely, and promised them cookies, to being something that you bought. On a diskette. Very often in a Ziploc bag, and accompanied by a two-pound manual in a 3-ring vinyl binder.
Packaging was very primitive in those days. Technical support, nearly nonexistent. Quality testing? Who the heck do you think we are, IBM? We're two guys working nights and weekends in a basement in Fridley, and we lucked into a distribution deal with ComputerLand.
It's also worth taking a moment now to remember just how primitive the microcomputers of the day were. My first TRS-80 Model 1 had a whopping 16 kilobytes of memory, and required an expansion chassis to be beefed up to 32 KB. (And not coincidentally, to jam every TV set and radio within a hundred yards, thanks to the naked ribbon cable that extended the data bus from the base to the expansion chassis. Shielding? What is this "shielding" you speak of?) My first Apple II+ had 48 KB of RAM, and required a third-party keyboard ROM and a haywired modification to the motherboard in order to produce lower-case letters. When I finally saved up enough to expand the memory to 64 KB and buy a second floppy disk drive, I was really living large.
This bears repeating. The microcomputers of the day all used floppy disk drives. (When they didn't use cassette tapes. The TRS-80, Apple II, Atari 400/800, Commodore VIC-20/64, and first-model IBM PC all had ports for cassette tape drives. Floppy disk drives were an expensive option.) Massive storage was having two floppy drives. This meant that each time the machine was powered up or rebooted, the entire operating system had to be reloaded from floppy, which in turn meant that:
a.) the OS had to be remarkably compact and therefore limited,
b.) every register and memory location in the machine was constantly being reinitialized with every power cycle,
c.) the machine essentially became a different, dedicated system each time it was rebooted and a new application loaded.
This latter aspect had advantages. For example, while working on PolyWriter (which later became Finale, which in evolved form is still on the market today), Phil Farrand, one of the authentic geniuses I've had the good fortune to work with, ran into a seemingly insoluble problem. The bug wasn't in his code; it was in Apple's DOS, and it was a show-stopper. For a few weeks he sought a workaround, trying out alternative solutions and running into brick walls every time, until finally, in frustration, he wrote his own operating system, which solved the problem.
I know, that sounds like a pretty drastic solution. But consider this: since you had to reboot the machine and reload the OS every time you wanted to use the program, why not use an alternative OS? As long as the data files you produce at the end of the day are file-compatible with the standard OS, where's the problem?
Well, it turned out the problem was that this constant rebooting and reinitializing masked a plethora of other problems, which didn't become evident until IBM introduced the XT, the first successful desktop machine equipped with a hard drive. (Apple actually beat IBM to the market with the Apple ///, but that beast can hardly be called successful. I've got two up in the loft of my garage. I used to use my Apple /// ProFile hard drive as a doorstop.) With the advent of the XT, people started turning on their computers and leaving them running for long periods of time, which began to reveal other previously hidden problems.
First up was the issue of "screen burn." It turned out that with the phosphors of the day, if you left your PC powered up and sitting on the same screen for long periods of time, the image eventually became permanently burned into the CRT. If you look around through electronic junk shops today, you can probably still find old IBM monitors with the VisiCalc frame permanently burned into the phosphors.
And thus was the screen-saver industry born...
It's tempting to take a few minutes here to wallow down Memory Lane, and think of all the companies and systems that have come and gone. Osborne, Kaypro, Altos, Corona, Commodore, Tandy, Texas Instruments, DEC: there was a time when Xerox could have owned the word-processing market, if only they'd been a little smarter — but they weren't. Wang: good grief, now there's the company that was synonymous with word-processing and office automation for the better part of a decade, but they were unable to survive the paradigm shift. Then there was the brief window in time when Tandy could have owned the PC-compatible market, if only they'd been able to resist the urge to intentionally make their hardware just slightly incompatible with industry standards, so that you were forced to buy parts and accessories from Tandy — but they couldn't, and they're gone. Ditto for DEC. Ditto for Compaq, but fortunately for them they reversed course in time.
There was a time when every journalist worth his byline packed a Tandy 100, because it was small; light; had a decently readable display, a serviceable built-in no-frills word processing program, and a built-in modem; ran for days on a set of standard AA batteries; and perhaps most importantly, because it had a full-sized but completely silent keyboard. If they'd only kept developing that machine, they could have preemptively staked out the entire mobile computing market segment. But...
The one I want to take a special moment to remember fondly now was the Amiga. When equipped with NewTek's Video Toaster, it was at least a decade ahead of its time, and a stunningly effective non-linear video editing system. If only it'd hung on long enough to see the advent of digital cameras and streaming Internet video, it could have ruled the world.
But it didn't. They didn't. All these companies, for one reason or another, just missed the brass ring, and all their work has since vanished into the mists of history and the dusty shelves of strange little museums. Perhaps the saddest, strangest case of them all was Digital Research, the company that made CP/M, which before the introduction of the IBM PC was the operating system for serious microcomputing work. It was powerful, versatile, well-developed and thoroughly debugged, and supported a large library of useful applications.
But for some reason — and this is where history gets conflated with myth and apocrypha — Digital Research shied away from making the deal with IBM, and so Big Blue instead went with an outfit called Microsoft, and a cheapo little CP/M clone OS that Microsoft had bought from someone else. It was originally named QDOS, but was later renamed PC-DOS and then MS-DOS.
And collectively, we've all been suffering from the after-effects of this decision ever since.
...to be continued...