Historical Reflections

The IBM PC: From Beige Box to Industry Standard

Looking back at three decades of PC platform evolution.
Figure. The IBM PC/XT, circa 1983.

The IBM personal computer was 30 years old last year. The IT world is more interested in the future than the past, so industry pundits used the anniversary primarily to ponder the end of the PC era: the future, they tell us, belongs to phones, tablets, and clouds rather than beige desktop boxes. Yet even if the 400 million or so PCs sold in 2011 were the last ever made, it would still be clear that no other computer architecture has ever been so important for so long. The PC evolved from a single machine to an industry standard, not just for desktop computers but for notebooks, workstations, and servers. Whether you run Windows, Linux, or even (since 2006) Mac OS, you are probably running it on this platform.

But what do we really mean by a “PC” anyway? The Lenovo laptop I used to write this column does not look or act very much like the IBM system I could have received for my ninth birthday, had my parents been able to afford more than the Sinclair ZX81 that actually launched my computing career. And the genuine IBM Portable PC tucked behind my filing cabinet is rarely the machine I reach for first when leaving on a research trip.

No detail of the original PC remains unchanged in its modern descendants, except perhaps the vestigial row of pins still found on some motherboards to beep an internal speaker. This recalls the apocryphal story of a label in the Tower of London: “Axe, 12th century (shaft 14th century, head 15th century).” Its buses, sockets, and interfaces have changed, though some capabilities are functionally preserved for backward compatibility through emulation, virtualization, or legacy modes. The results are impressive—I once successfully booted a 1999 PC with a PC-DOS 1.1 diskette from 1982.
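Those vestigial speaker pins still answer to the same programming interface as the 1981 original. As a minimal sketch of that continuity (assuming a DOS-era compiler such as Turbo C, whose <dos.h> supplies outportb(), inportb(), and delay()), this is how period software made the speaker beep:

    /* beep.c -- drive the legacy PC speaker the way 1981 software did.
       A sketch assuming a DOS-era compiler such as Turbo C, which
       provides outportb(), inportb(), and delay() in <dos.h>. */
    #include <dos.h>

    void beep(unsigned freq_hz, unsigned duration_ms)
    {
        /* The speaker hangs off channel 2 of the 8253/8254 timer,
           which divides a fixed 1,193,182 Hz input clock. */
        unsigned divisor = (unsigned)(1193182L / freq_hz);

        outportb(0x43, 0xB6);                  /* channel 2, square-wave mode */
        outportb(0x42, divisor & 0xFF);        /* divisor, low byte */
        outportb(0x42, (divisor >> 8) & 0xFF); /* divisor, high byte */

        outportb(0x61, inportb(0x61) | 0x03);  /* gate the timer, enable speaker */
        delay(duration_ms);
        outportb(0x61, inportb(0x61) & ~0x03); /* silence */
    }

    int main(void)
    {
        beep(880, 250); /* a quarter-second A5 */
        return 0;
    }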


Components of the PC Story

The story of the PC platform has five main chapters, and in each one it meant something different. For a year after its very successful 1981 launch, the PC platform was a single proprietary computer model, albeit one built largely from standard parts to control costs and speed its introduction. In this it resembled earlier personal computers, such as the Apple II launched four years earlier. The creation of the PC has been told and retold many times by journalists, so that IBM’s fateful alliance with Microsoft has taken on the character of an origin myth for the modern computer industry.

In the second chapter the PC platform broadened. Buyers could choose between an IBM PC, the more powerful PC/XT (1983) and PC/AT (1984), or a “clone” from another company. Each IBM machine set a new de facto standard, which was quickly replicated by clones claiming to be “100% compatible” with the real thing, even replicating its foibles. The shift was captured by PC Magazine when, in 1986, it changed its subtitle from “the independent guide to IBM personal computers” to “the independent guide to IBM-standard personal computing.” CP/M, the previous standard platform for business microcomputers, had run on a diverse range of hardware. In contrast, attempts by companies such as Apricot and DEC to improve their MS-DOS computers at the expense of compatibility were ultimately rejected by the market. The new platform was the combination of IBM hardware and DOS, rather than either in isolation. This “standard” was defined operationally by trying to run Lotus 1-2-3 and Microsoft Flight Simulator, both of which gave the quirks of the IBM hardware a thorough workout.

The third chapter began in 1987 when IBM replaced all its existing personal computer models with a new Personal System/2 range. Reporting the launch, the New York Times quoted Steve Ballmer’s opinion that this was “the most important introduction in the short history of personal computers. It’s the computer architecture for the next decade.”5 IBM expected clone makers to start paying license fees to copy the new and heavily patented Micro Channel Architecture bus and other system features. This made sense. The clone industry had worked well to faithfully copy IBM’s models and drive down costs, but its biggest early innovation had been putting a handle on the box and squeezing in a tiny screen to create the 28-pound Compaq Portable.

Few expected the PC/AT architecture to survive for long once IBM abandoned it. A Gartner Group analyst was quoted saying “if the IBM ‘clone’ companies hope to keep their share of the corporate market, they’ll have to match IBM’s new personal computer architecture.”2 Yet it was IBM that found itself isolated, and by 1994 it had lost its number-one position in the personal computer market. The supposedly obsolete PC/AT instead became the basis of a constantly evolving informal standard, under the control of no single company. Clone companies eventually duplicated many of the features IBM introduced with its PS/2 machines, including 3.5-inch disk drives and new connectors for keyboards and mice, but never adopted the overall PS/2 architecture. The clone firms tried to push “Industry Standard Architecture” as a new term for their platform, but the machines were still generally called “IBM Compatible” even though it was no longer clear what this meant. IBM’s own machines were now less compatible than those of its competitors.

No longer could any single company redefine the PC platform. Yet this was the chapter in which it advanced most dramatically and crushed spirited competition from Apple, Atari, Commodore, and other proprietary alternatives. By the end of this third chapter in 1996 the processor power of a high-end PC (by then a Pentium-based machine) had risen by a factor of around 100 compared to the original PC/AT. Large screens driven by high-performance graphics cards on 32-bit buses had replaced slow and ugly EGA displays. Hard disk capacities had risen from 30 megabytes to several gigabytes. RAM capacity had grown from 512KB to 32MB, with high-quality audio and CD-ROM drives giving new multimedia power. The operating system advanced from DOS 3.3 to Windows 95 (or the more robust NT 4.0), and the PC finally overcame legacy limitations in its use of memory that had frustrated earlier users.
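Those memory limitations were a direct legacy of the 8086’s segmented addressing, which the platform carried forward for compatibility. A small self-contained sketch in standard C (values for illustration only) shows why DOS programs ran out of room at 640KB:

    /* segments.c -- the real-mode address arithmetic behind the
       famous 640KB barrier.  Standard C, illustrative only. */
    #include <stdio.h>

    /* 8086 real mode: physical address = segment * 16 + offset,
       for a total of 20 address bits, i.e., one megabyte. */
    unsigned long physical(unsigned segment, unsigned offset)
    {
        return ((unsigned long)segment << 4) + (unsigned long)offset;
    }

    int main(void)
    {
        /* Programs ended at 640KB because segment 0xA000 upward was
           reserved for video buffers and ROMs... */
        printf("top of conventional memory: %lu bytes\n",
               physical(0xA000, 0x0000));  /* 655360 = 640KB */
        printf("CGA text buffer at:         0x%05lX\n",
               physical(0xB800, 0x0000));
        /* ...and nothing could reach past the 8086's 20-bit limit. */
        printf("highest address:            0x%05lX\n",
               physical(0xFFFF, 0x000F));  /* 0xFFFFF = 1MB - 1 */
        return 0;
    }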


This innovation took place incrementally and with no central point of control. As a historian of technology and business I find this the most interesting and understudied part of the story. PC companies of the era had little in common with traditional high-technology manufacturers. PC/AT clones of the late 1980s were assembled from a handful of standard parts, each typically built by a different specialist firm. One was the motherboard, into which a processor and memory chips were inserted. This was screwed into a case, along with a power supply. Several key functions were performed by expansion cards plugged into 16-bit slots. A typical configuration filled three slots with a display adapter, a combined parallel and serial card to drive a printer and modem, and a disk controller card. Hard and floppy disks filled two drive bays of standard dimensions. All of these parts, including the motherboard, were available in dozens of variants from different hardware suppliers. (This also facilitated the exodus of component manufacturing to Asia, as suppliers could focus on low-cost niches without needing to engineer or market a whole computer.)

Moderately knowledgeable computer users saved money by building their own computers, guided by books or evening classes. This could be done in an hour using no tool more exotic than a Phillips-head screwdriver. The more commercially minded built to order for their friends and eventually set up computer businesses in neighborhood storefronts or, as Michael Dell famously did, in his dorm room. A PC clone “manufacturer” needed to procure only one custom part: a small badge bearing its logo (to be stuck in the standard depression on the top left corner of the case). Everything else could be ordered from a catalogue.

The physical structure of the PC/AT came to define the market structure of the personal computer industry. A PC company, even a big one like Dell, did little original development work. It enhanced its models continually by procuring improved components. Improved parts would pop right in, but only if they precisely fitted the constraints and interfaces evolved from the original PC/AT. Within this framework some kinds of innovation were easy, others hard, and some impossible.

The easiest improvements involved substitution of a single component with no changes to interfaces or the structure of the overall system: faster processors, bigger hard drives, or new components like sound cards. One of the most dramatic component innovations came from Compaq, the largest of the clone makers, when in 1986 it stopped waiting for IBM to make something new to copy and launched the Deskpro 386. This shoehorned Intel’s new 80386 processor into the existing architecture of the PC/AT. At the time this looked like a stopgap solution. Newsweek called it a “calculated risk” and noted that “customers may choose to wait for IBM, fearing that any other computer will be incompatible.” However, Compaq’s engineering was soon duplicated by specialist motherboard producers as a way of keeping the clone ecosystem refreshed with new technology.

Other changes required producers of several components to work together. This was more difficult, but not impossible. For example, the original PC/AT used a complex and expensive hard disk controller. This set a de facto standard (known as ST-506) and rival firms toiled to produce compatible drives and controllers that were better and cheaper. But by the end of the 1980s PC clones were moving over to the new IDE standard, which shifted most of the control electronics onto the drive itself to lower costs, improve performance, and support much larger drives. Drive manufacturers cooperated with motherboard producers (and the producers of chipset and BIOS components who supplied them), computer assemblers, and of course Microsoft to incorporate the necessary changes. Another big change was the introduction of several rival higher-speed bus standards (EISA, PCI, and VLB), all of which required motherboard and expansion card producers to adopt new technology but preserved the physical dimensions of the card itself.
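IDE’s shift of intelligence onto the drive shows up directly in its programming model: rather than micromanaging the mechanism, the host can simply ask the drive to describe itself with the ATA IDENTIFY DEVICE command. A bare-bones sketch follows, again assuming a DOS-era compiler with direct port access and a drive on the legacy primary channel at ports 0x1F0–0x1F7; real code would also poll the controller’s busy bit and check for errors:

    /* identify.c -- ask an IDE/ATA drive to describe itself, illustrating
       IDE's "intelligence on the drive" design.  A sketch assuming a
       DOS-era compiler with <dos.h> port I/O; error handling omitted. */
    #include <dos.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned id[256];
        int i;

        outportb(0x1F6, 0xA0);           /* select the master drive */
        outportb(0x1F7, 0xEC);           /* ATA IDENTIFY DEVICE command */

        while (!(inportb(0x1F7) & 0x08)) /* wait for DRQ: data is ready */
            ;

        for (i = 0; i < 256; i++)        /* the drive returns 256 words */
            id[i] = inport(0x1F0);       /* 16-bit reads from the data port */

        /* Words 1, 3, and 6 are the drive's own report of its geometry. */
        printf("cylinders %u, heads %u, sectors per track %u\n",
               id[1], id[3], id[6]);
        return 0;
    }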

The most difficult thing to change was what would, in a more conventional kind of high-technology product, have been the easiest: the case design. The standard power supply occupied the right rear corner, with an inconveniently located switch. The case and the motherboard would invariably be produced by different companies. This was not a problem as long as the motherboard was not too large for the case and had its mounting holes in the right places. It could shrink a little, but not too much, as it had to reach the mounting holes and extend far enough to align its expansion slots and keyboard connector correctly with the cutouts in the back of the case. This determined the minimum width of the case, and the height was set by the height of the expansion cards, so PCs remained bulky even as workstations shrank into fashionable “pizza boxes.” Any use of custom components raised costs and limited flexibility, though cases did eventually start to feature power switches on the front and special rear mountings for mouse and printer sockets. Even the popular “mini-tower” format just took the traditional desktop layout, shortened it, and turned it sideways.

Tiny choices made by the original PC/AT designers imposed fundamental constraints 10 years later. For example, graphics, network, and sound controllers were rarely integrated onto the motherboard. Why? Not because of any technological limit, but simply because there were no holes on the back of the standard case through which they could protrude to the outside world. On the other hand, hard and floppy disk controllers were widely integrated by this point as the standard connector could simply be routed inside the case to the motherboard.

The fourth chapter opens with the appearance of machines based on Intel’s new ATX motherboard format in 1996. Intel’s dominance had been growing from processors into other key motherboard components, so for the first time since IBM had abandoned the PC/AT architecture almost a decade earlier there was a company with the power to introduce a successful new standard format. ATX revised the case design, power supply connections, and system board layout. It included a large and flexible case opening so that peripherals could be connected directly to motherboard sockets. The final major constraint from the original IBM designs had been removed. More and more hardware was merged onto a few large motherboard chips from Intel or one of a handful of rivals, typically including graphics, sound, and networking functions. People stopped talking about “IBM PC Compatible” computers and spoke instead just of PCs or, more revealingly, of Wintel computers after their key features of Intel x86-based hardware and Microsoft Windows operating systems.

We live in the fifth and perhaps final chapter, which began in the mid-2000s when laptop PCs (and later netbooks) began to outsell desktops. Desktop PCs will soon be akin to V8 engines or tube amplifiers—used only by hobbyists and those obsessed with performance over practicality. But the PC’s fundamental architecture lives on in new packages.


Conclusion

The history of technology includes many landmark products and many successful standards. However, the IBM PC is perhaps unique in evolving seamlessly from a single proprietary product to an open standard and the foundation of an entire global industry. Some ideas developed by historians are useful in understanding this—for example, the idea of backward compatibility long predates the computer, as historian Thomas Hughes showed when he created the term “technological momentum” to describe the power of established electrical power systems.1 The special characteristics of computer systems, in particular the rapid growth in power, the layering of technologies, and the use of emulation to retain compatibility have given the PC a new kind of evolutionary flexibility.

At the same time, this story reminds us of the importance of paying attention to the mundane details of technology. Who would have guessed that the case layout would be the last main feature of IBM’s design to be retained unchanged, and a major constraint on the development of the platform? Yet of course a hole, unlike a processor addressing mode or API, is something that cannot be retained virtually in a special compatibility mode. It is either there or not. As sociologists of technology such as Donald MacKenzie and Trevor Pinch have recently reminded us, no technology can truly transcend the joys and trials of this material world.3,4


Further Reading

The creation of the original PC is told from an IBM perspective in Chposky, J. and Leonsis, T. Blue Magic: The People, Power, and Politics Behind the IBM Personal Computer. Facts on File, NY, 1988.

For a broader and more entertaining look at the early years of the personal computer industry, including insightful analysis of the rise of the clone industry, see Cringely, R.X. Accidental Empires: How the Boys of Silicon Valley Make their Millions, Battle Foreign Competition, and Still Can’t Get a Date. Addison-Wesley, Reading, MA, 1992.

Historians have started to think about the rise of the PC as a standard in Sumner, J. “Standard and Compatibility: The Rise of the PC Computing Platform.” In “By whose standards? Standardization, stability and uniformity in the history of information and electrical technologies.” Volume 28 of History of Technology, J. Sumner and G.J.N. Gooday, Eds., Continuum, London, 2008, 101–127.


Figures

Figure. The IBM PC/XT, circa 1983.

References

    1. Hughes, T. Networks of Power: Electrification in Western Society, 1880–1930. Johns Hopkins University Press, Baltimore, MD, 1983.

    2. Knobelsdorff, K.E. IBM's four-month-old PS/2 has put computer world on hold. Christian Science Monitor (Aug. 19, 1987).

    3. MacKenzie, D. Material Markets: How Economic Agents Are Constructed. Oxford University Press, NY, 2009.

    4. Pinch, T. and Swedberg, R., Eds. Living in a Material World. MIT Press, Cambridge, MA, 2008.

    5. Sanger, D.E. IBM offers a blitz of new PC's. New York Times (Apr. 3, 1987).
