
A Brief History of Personal Computing Part 1
By Dale Long - July-September 2002
In honor of the 20th anniversary year of CHIPS, the editors have asked me to dust off my "Way-Back" Machine and look back at how computing has progressed over the last 20 years that CHIPS has been in business. Wow! There probably isn't an area of human endeavor that has evolved as rapidly as computing technology has over the last 20 years. Along with that, we've also undergone some significant changes in our understanding of the applications and effects of computing technologies.

As the title suggests, this will primarily be a look at personal computing, so I will focus on desktop systems. Unix, ARPANET, mainframes, and even older analog computers were a big part of computing history. However, we only have so much space. In this issue we will look primarily at the hardware side of personal computing, those desktop machines that went from home-built hot rods to the polished production models of today. In the next issue, we will look at software and explore how the personal computer revolution greatly influenced the growth of networking and related activities.

People, computers, and networking – describing this symbiotic progression is not a trivial task. It is, however, a task chock full of trivia that includes the machines that set the standards for the one-eyed monsters that sit on our desks and on our laps today. So, if you will, please sit back and join me as we take a short ride on a fast machine and review a brief history of personal computing.

The Dark Ages

To really do justice to the last 20 years, we need to start just a bit further back: the 1970s. It was during this period that the pioneers of personal computing developed the first systems designed for individual users. Many people credit Steve Wozniak and Steve Jobs, of Apple Computer fame, with developing the first personal computer (PC). As the story goes, both of these bright young men were fresh out of college in 1976. Much to their dismay, they discovered that their access to the university mainframe had been terminated when they graduated. As they were both allegedly Net Trek addicts, this was an intolerable situation — the Federation needed them! When they pleaded for access, they were essentially told that if they were that desperate to save the Federation from the Romulan Empire, they should go build their own computer. And so, 13 days later, they did. The Apple I debuted in April 1976 at the Homebrew Computer Club in Palo Alto, CA. It was basically a circuit board built around the 1 megahertz MOS Technology 6502 chip with 8 kilobytes of RAM (expandable to 32KB) and an optional cassette tape interface. The circuit board sold for $666.66, but you had to build your own case and plug it into a TV to get a display.

However, the Apple I wasn't actually the first personal computer on the block. The IBM 5100 Portable Computer, circa 1975, was the world's first integrated, transportable computer. At a cost of $20,000, the 5100 was the computer world's equivalent of the Duesenberg. The 5100 featured a built-in CRT monitor and tape cartridge drive, and included the APL and BASIC programming languages and startup diagnostics in its 48KB of read-only memory (ROM). It could hold 16KB to 64KB of plug-in RAM, used a serial input/output bus, and came with a leather case.

However, the MITS Altair 8800 microcomputer, also from 1975, is the machine that really launched the PC industry. It employed an Intel 8080 Central Processing Unit (CPU), which was originally used to control traffic lights, and came with a standard memory of 256 bytes (yes, that's bytes, not kilobytes), four expansion slots, and 78 machine-language instructions. A kit cost $439 and a fully assembled version cost $621. MITS offered 4KB and 8KB versions of Altair BASIC, the first product developed by a little startup named Microsoft, run by a couple of guys named Bill Gates and Paul Allen.

Those of you who consider the IBM PC the bright center of desktop business computing, take heed: the Apple II (together with VisiCalc) was what got people thinking about personal computers as business tools, not just toys. The Apple II debuted at the first West Coast Computer Faire in San Francisco in 1977. With a built-in keyboard, graphics display, eight expansion slots, and BASIC built into ROM, the Apple II was the first PC on the market simple enough for a five-year-old to use. Some of its innovations included built-in high-resolution color graphics and a high-level language with graphics commands, features that were extraordinary for their day and were not matched by IBM PC clones for more than a decade. The Apple II sold for around $1,298 with 16KB of RAM, a 16KB ROM, a cassette interface (later replaced by a floppy drive in 1978), and color graphics.

Another line of influential early computers came from Commodore. The Commodore PET (Personal Electronic Transactor) was also introduced at the 1977 West Coast Computer Faire. Commodore produced a long line of inexpensive personal computers that brought computing to the masses. The Commodore VIC-20 was the first computer to sell 1 million units, and the Commodore 64 was the first to offer a "huge" 64KB of memory. Another old favorite was the Radio Shack TRS-80, also known as the "Trash-80" by friends and detractors alike. The base unit was essentially a thick keyboard with 4KB of RAM and 4KB of ROM. Much of the software for this system was distributed on audiocassettes read into the machine from Radio Shack cassette recorders. Trash-80s running games were a popular draw with kids at Radio Shack stores in those days, in much the same way that PlayStation and Nintendo demonstration machines are today.

This next example is a real piece of history, not because of the technology, but because of the business gaffe that killed it off: the Osborne 1 Portable. It boasted the low price of $1,795, and Osborne was the first producer to bundle software with the machine, in a big way: the Osborne 1 came with nearly $1,500 worth of programs, including WordStar, SuperCalc, BASIC, and a host of other utilities. Business was brisk until Osborne announced its next model while it still had a large inventory of Osborne 1s. The result: everyone decided to wait for the new machine instead of buying the already available Osborne 1s. The inventory sat, revenue plummeted, and Osborne declared Chapter 11 bankruptcy soon thereafter.

The last example before we enter the "Chips Age" of computing is the Xerox Star. Compared to its contemporaries, the Star was probably the most innovative PC of any era of computing. The Star was a product of the Xerox Palo Alto Research Center (PARC). Chief among its innovations were the use of a mouse as an input device and a desktop graphical user interface (GUI) with icons. It is the progenitor of all modern GUI systems, starting with Apple's Lisa and Macintosh computers, the Commodore Amiga, and eventually Windows-based PCs. However, despite its gee-whiz nature, the Star was not a commercial success, most likely because its price started at around $50,000. If the IBM 5100 was a Duesenberg, the Star was a Rolls-Royce.

The Dawn of the Chips Age

In 1981, mainframe king IBM recognized the business potential of personal computers and made the landmark announcement of the IBM PC. Thanks to an open architecture, IBM's clout, and Lotus 1-2-3 (released in early 1983), the IBM PC and its offspring made desktop computing a legitimate part of the business landscape and started the ineluctable transition to the Information Age. The original IBM PC cost $3,000 and came with Intel's 16-bit 8088 chip, 64KB of RAM, and a 5-1/4-inch floppy drive. A printer adapter, monochrome monitor, and color graphics adapter were available as options.

One of the business decisions that ensured the IBM PC architecture's swift proliferation was its openness: IBM built the machine largely from off-the-shelf parts and published its technical specifications, which allowed other manufacturers to build compatible machines. Columbia Data Products was the first to produce an IBM PC clone, but it soon folded. Next up to the plate was Compaq, whose "portable" model single-handedly kick-started the PC clone market. Compaq rapidly earned a reputation for engineering and quality, and its computers were 100 percent IBM compatible, setting the standard for everyone who followed.

While IBM and its allies were rapidly moving into the market, perhaps the most remarkable computer of the mid-1980s was the Apple Macintosh, introduced in 1984. There was some debate at the time as to whether the Mac was a "toy computer" for wimps who couldn't handle command line prompts or a powerful, intuitive machine that transformed the computing experience. Eighteen years later, the fact that even Unix variants now ship with GUIs stands as a testament to Apple finding a way to make the groundbreaking work from Xerox PARC available (and affordable) to the average user. Weighing in at $2,495, the original Macintosh included Motorola's 16-bit 68000 microprocessor, 128KB of RAM, a high-resolution monochrome display, the Mac operating system (OS), and a single-button mouse. Apple bundled some key applications, including a little program called "MacPaint" that finally demonstrated to everyone what a mouse was good for.

Also introduced during this period was the last great Commodore system: the Amiga. The Amiga 1000 introduced the world to multimedia. For only $1200, the 68000-based Amiga 1000 did graphics, sound, and video so well that many broadcast professionals adopted it for special effects. Its ability to handle sophisticated multimedia was unique in its day for a PC, as was its multitasking, windowing OS. The Amiga reached its zenith with the 3000 series, but eventually withered on the vine. While it had great multimedia and game applications, it was considered even more of a toy than the Macintosh due to its almost complete lack of business software. Given the emphasis placed today on the graphic-intensive game-playing power of PCs, the Amiga was truly a machine well ahead of its time.

The Winner, and Still Champion…

The machine that set the standard for the majority of the PC computing that we do today was the IBM AT, also released in 1984. The AT introduced Intel's 80286 CPU running at 6 MHz and a 16-bit bus structure that gave the AT several times the raw performance of previous IBM systems. Due to dramatic drops in storage costs, basic hard drive capacity doubled from 10MB to 20MB. Oddly enough, installing two 20MB hard drives resulted in 41MB of space, which may be the first instance of a "rounding error" creeping into the computing lexicon. The 286-based AT was followed shortly thereafter by the first generation of 80386 computers. This time, however, IBM wasn't leading the way. It was distracted by its obsession with developing proprietary Micro Channel PS/2 systems. Instead, clone vendors ALR (now part of Gateway) and Compaq took control of the 80x86 architecture and introduced the first 386-based systems, the Access 386 and the Deskpro 386. Both systems maintained backward compatibility with the 286-based AT.

Apple Computer, beginning its slide into a small but stable niche in the computer market, paralleled the development of the 386 PC architecture with the Macintosh II line. The Mac II followed the PC idea (a rare reversal in form) of giving users a computer they could open and upgrade themselves. Unlike previous Macs, with their integrated monitors, the 68020-powered Mac II used a separate monitor that typically sat on top of the CPU case. At a time when 80386 and 80486 machines were still wrestling with DOS-based memory restrictions, the Mac II could handle up to 64MB of RAM.

Modern Times

Most of you are probably familiar with the systems that followed. As we're all aware, the IBM PC architecture became the dominant breed on the planet. The Intel 80486 followed the 386 models, but the real fun began with the first Pentium chips and their successors. On the Macintosh side, Motorola, together with Apple and IBM, developed the PowerPC processor. This was a radical departure from the dominant Intel chip architecture, and the differences between the two are worth noting. Intel (and their rival Advanced Micro Devices (AMD)) processors are based on something called the Complex Instruction Set Computer (CISC) architecture, which uses microcode to execute very comprehensive instructions. Instructions may be variable in length and use all addressing modes (including direct and indirect), requiring complex circuitry in the CPU to decode them. The PowerPC, on the other hand, is based on the Reduced Instruction Set Computer (RISC) architecture, which reduces chip complexity by using simpler instructions. Prior to the PowerMac desktop computer, RISC processors were seen almost exclusively in high-end workstations and servers. RISC-based chips eliminate the microcode layer and its associated overhead, keep instruction size constant, ban the indirect addressing mode, and retain only those instructions that can be overlapped and made to execute in one machine cycle or less.

OK, so what does that mean in plain English? Think of it this way. Let's say, for simplicity's sake, that a CISC chip can add, subtract, multiply, and divide. Multiplication and division are complex instructions that require a lot of horsepower. Now let's say that a RISC chip, on the other hand, can only add and subtract. However, because it is super-efficient, it can add and subtract so fast that the result is indistinguishable from native multiplication and division, and it can often finish the problem faster than a similarly aspirated CISC chip. For this reason, a 500MHz RISC chip will generally perform faster than a 500MHz CISC chip, and can usually keep up with a CISC processor running at twice its speed. I don't expect everyone to go out and buy Macintoshes now that you know this, but some day when you find yourself in the middle of a debate between a Mac fan and a PC advocate, at least you'll understand some of what they're arguing about.
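
To make the add-and-subtract analogy a little more concrete, here is a minimal C sketch. It is purely illustrative (the function names are my own invention, and real RISC chips do have multiply hardware): one routine uses a single multiply, the way a complex CISC instruction might, while the other builds the same product out of nothing but simple additions, the way a reduced instruction set composes complex work from many fast, simple operations.

    #include <stdio.h>

    /* "CISC-style": one complex operation does all the work. */
    static unsigned multiply_direct(unsigned a, unsigned b)
    {
        return a * b;            /* a single multiply instruction */
    }

    /* "RISC-style" analogy: the same product built from nothing but
     * simple additions. (Real RISC chips do have multiply hardware;
     * this only illustrates the "simple instructions" idea.) */
    static unsigned multiply_by_adding(unsigned a, unsigned b)
    {
        unsigned product = 0;
        while (b-- > 0)
            product += a;        /* repeated simple adds */
        return product;
    }

    int main(void)
    {
        printf("%u %u\n", multiply_direct(6, 7), multiply_by_adding(6, 7));
        return 0;
    }

Both routines print 42; the point is simply that a chip built around a handful of very fast, simple instructions can synthesize the fancier operations when it needs them.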

However, the Pentium swept through the PC industry faster than any of Intel's previous chips. Although Intel's 486DX (April 1989) integrated a floating-point unit (FPU) to speed math calculations and was much faster than the 386, it was the Pentium that introduced the next leap forward in the x86 microarchitecture: superscalar pipelines. Superscalar pipelining allows a processor to execute more than one instruction per clock cycle. If we think of it in terms of roads, it is like adding extra lanes to a highway: with three active lanes on a superhighway, we can handle three times as many cars. In processor terms, think of those three lanes as three mathematicians working problems independently, allowing three unrelated add instructions to execute simultaneously. In theory, a 1GHz three-way superscalar processor could execute 3 billion instructions per second. Skeptics said a CISC architecture could not do this. With the Pentium, Intel proved otherwise.
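
As a rough sketch (hypothetical, not a description of the Pentium's actual pipeline), the C fragment below shows the difference between independent work and a dependent chain. A superscalar processor is free to issue the three unrelated additions in the same clock cycle, but the chained additions must run one after another because each one needs the previous result.

    #include <stdio.h>

    int main(void)
    {
        int a = 1, b = 2, c = 3;

        /* Independent adds: like three open highway lanes, a superscalar
         * CPU may issue all three in the same clock cycle. */
        int x = a + 10;
        int y = b + 20;
        int z = c + 30;

        /* Dependent chain: each add needs the previous result, so these
         * run one after another no matter how many lanes the chip has. */
        int w = a + 10;
        w = w + 20;
        w = w + 30;

        printf("%d %d %d %d\n", x, y, z, w);
        return 0;
    }
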

Conquering Without Dividing

The Pentium did have one really famous problem, however. In 1994 there was a major flap in the media about an error in the Pentium's ability to divide. Like the 486 before it, the Pentium included a floating-point unit (FPU) to speed math calculations. Earlier Intel chips did all their arithmetic using integers; programs that used floating-point numbers (non-integers like 2.5 or 3.14) had to tell the chip how to divide them using integer arithmetic. The Pentium built these operations into the FPU itself, greatly speeding up numerical calculations. This was also the reason the Pentium was more complex and expensive than its predecessors. The problem for Intel was that all Pentiums manufactured for almost a year had a division error built into the FPU, causing the chip to divide certain floating-point numbers incorrectly. At the time, many software packages, including many that used floating-point numbers, were not written to take advantage of an FPU, so they didn't show the error. Also, only certain numbers divided incorrectly. As a result, most people never personally experienced the problem.

The most famous example of the "Divide by Pentium" error, discovered by Tim Coe of Vitesse Semiconductor, could be observed by dividing 4,195,835 by 3,145,727. The correct value is 1.33382 (to 5 decimal places), while the flawed Pentium's floating-point unit computed 1.33374, a relative error of 0.006 percent. While this wouldn't be a problem for balancing your checkbook, it was of considerable interest to college mathematics professors, including Thomas Nicely, a math professor at Lynchburg College in Virginia. Professor Nicely was computing the sum of the reciprocals of a large collection of prime numbers on his Pentium-based computer. Checking his computation, he found the result differed significantly from theoretical values. He got correct results when running the same program on a computer with a 486 CPU and eventually isolated the error to the Pentium itself. After getting no real response from Intel to his initial inquiries, and after checking his facts, Nicely posted a general notice on the Internet asking others to confirm his findings. Things rolled swiftly downhill from there, including newscasts on CNN. In response to publicity about the problem, Intel announced that "an error is only likely to occur [about] once in nine billion random floating point divides," and that "an average spreadsheet user could encounter this subtle flaw once in every 27,000 years of use."
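
For the curious, Coe's example is easy to try in code. The short C program below is a sketch using ordinary double-precision arithmetic; on any correct FPU it prints the right quotient and a check value of essentially zero, whereas the flawed Pentium reportedly returned roughly 1.33374 for the quotient and about 256 for the check.

    #include <stdio.h>

    int main(void)
    {
        double x = 4195835.0;    /* Tim Coe's failing numerator   */
        double y = 3145727.0;    /* ...and denominator            */

        /* A correct FPU prints about 1.33382; a flawed Pentium
         * computed about 1.33374 for this division. */
        printf("x / y           = %.5f\n", x / y);

        /* Mathematically this expression is exactly zero; the flawed
         * Pentium reportedly returned about 256 here instead. */
        printf("x - (x / y) * y = %.5f\n", x - (x / y) * y);

        return 0;
    }
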

Unfortunately for Intel, these pronouncements did not engender great confidence among the computing faithful. While the chance of accidentally entering a pair of "bad inputs" was low, the Pentium's output for those inputs would be wrong every single time. Many people complained that, without completely repeating their calculations on other computers, they could not tell whether their results were actually falling victim to the error. As a result of the furor, IBM halted shipments of Pentium-based computers within a month and announced that "Common spreadsheet programs, recalculating for 15 minutes a day, could produce Pentium-related errors as often as once every 24 days." While PCs were only a small percentage of IBM's business, and the announcement may have been motivated by someone with a mainframe background looking for any reason to cast aspersions on the PC, it sent a shockwave through the computing world.

Intel's original policy, when it finally publicly accepted responsibility for the problem, was to replace Pentium chips only for those who could explain their need for high accuracy in complex calculations. Of course, pretty much everyone decided that they fell into that category, and the complaints rolled in. It took less than a month for Intel to realize that it would be far better off losing a little money than losing its entire reputation. Intel finally relented and offered free replacement Pentiums to any owner who asked for one.

Since then, the PC chip race, in both price and performance, has been ratcheting up between Intel and AMD. While the initial Pentiums ran at 60MHz and 66MHz, systems with chips running at over 1GHz are commonplace now, and the current race is to produce commercially viable 2GHz chips. To really understand just how far CPUs have evolved, we will close this section of our history tour with a brief comparison of processor speeds through the years.

Speed Thrills

While researching this article, I found a Web site that contains a long list of processor benchmarks, starting with the Intel 386 20MHz chip and working its way up to the AMD Athlon 1200 and the Intel Pentium 4/1400. The increase in raw processing speed over the 15 years' worth of systems the site lists is striking. The old 386/20 processed the test code in 854 seconds. The Athlon and Pentium 4 processed the same test in 0.43 and 0.55 seconds respectively, roughly a 2,000-fold improvement. In comparison, if automobile speed had improved that much since the Model T, we could drive around the Equator twice in about an hour. If automobile performance improved that much again in the next 15 years, we could drive around the Equator in less than a second, provided we had some way to keep the resulting friction from frying us to a crisp. Flights of fancy aside, if you would like to see the speed benchmarks for a wide variety of chips, you can find the complete list at http://field.hypermart.net/CPU/cpu.htm.

Closing Words

This ends our look back at CPUs through the "Chips Age." I spent more time on early (some would say prehistoric) CPUs than on current ones because I assume most of you are familiar with what is available today but might not know much about where these machines came from and why they are (or are not) still here. In the next issue, we will tackle the softer side of computers: operating systems and applications, the stuff that really distinguishes one box with a video monitor and a keyboard from another. Now if you will excuse me, it's time to go over to Zippy's house and help him get through "The Bard's Tale" on his "antique" Apple IIe. You just can't improve some things, no matter how fast a CPU chip can do the math.

Happy Networking!

Long is a retired Air Force communications officer who has written for CHIPS since 1993. He holds a Master of Science degree in Information Resource Management from the Air Force Institute of Technology. He is currently serving as the Telecommunications Manager for the Eastern Region of the U.S. Immigration & Naturalization Service.

The views expressed here are solely those of the author, and do not necessarily reflect those of the Department of the Navy, Department of Defense or the United States government.

