Many early IBM PC compatibles used the same computer bus as the original PC and AT models. The IBM AT compatible bus was later named the Industry Standard Architecture bus by manufacturers of compatible computers. The term "IBM PC compatible" is now a historical description only, since IBM has ended its personal computer sales.
Descendants of the IBM PC compatibles comprise the majority of personal computers on the market presently, although interoperability with the bus structure and peripherals of the original PC architecture may be limited or non-existent.
Origins
IBM decided in 1980 to market a low-cost single-user computer as quickly as possible in response[citation needed] to Apple Computer's success in the burgeoning microcomputer market. On 12 August 1981, the first IBM PC went on sale. There were three operating systems (OS) available for it. The least expensive and most popular was PC DOS, made by Microsoft. In a crucial concession, IBM's agreement allowed Microsoft to sell its own version, MS-DOS, for non-IBM computers. The only proprietary component of the original PC architecture was the BIOS (Basic Input/Output System).

IBM at first asked developers to avoid writing software that addressed the computer's hardware directly, and to instead use the BIOS.[1] However, software that addressed the hardware directly instead of making standard calls was faster; this was particularly relevant to games. The IBM PC was sold in high enough volumes to justify writing software specifically for it, and this encouraged other manufacturers to produce machines which could use the same programs, expansion cards, and peripherals as the PC. The 808x computer marketplace rapidly excluded all machines which were not functionally very similar to the PC. The 640 kB barrier on "conventional" system memory available to MS-DOS is a legacy of that period; other, non-clone machines did not have this limit.
Rumors of "lookalike", compatible computers, created without IBM's approval, began almost immediately after the IBM PC's release.[2] By June 1983 PC Magazine defined "PC 'clone'" as "a computer [that can] accommodate the user who takes a disk home from an IBM PC, walks across the room, and plugs it into the 'foreign' machine".[3] Because of a shortage of IBM PCs that year, many customers purchased clones instead.[4] Columbia Data Products produced the first computer more or less compatible with the IBM PC standard in June 1982, soon followed by Eagle Computer. Compaq announced its first IBM PC compatible in November 1982, the Compaq Portable. The Compaq was the first sewing machine-sized portable computer that was essentially 100% PC-compatible. The company could not copy the BIOS directly, as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and then write its own BIOS using clean room design.
Compatibility issues
At the same time, many manufacturers such as Xerox, Hewlett-Packard, Digital Equipment Corporation, Sanyo, Texas Instruments, Tulip, Wang and Olivetti introduced personal computers that were MS-DOS compatible, but not completely software- or hardware-compatible with the IBM PC.

Like IBM, Microsoft's intention was that application writers would write to the application programming interfaces in MS-DOS or the firmware BIOS, and that these would form what would now be termed a hardware abstraction layer. Each computer would have its own Original Equipment Manufacturer (OEM) version of MS-DOS, customized to its hardware. Any software written for MS-DOS would operate on any MS-DOS computer, despite variations in hardware design.
This expectation seemed reasonable in the computer marketplace of the time. Until then, Microsoft's business had been based primarily on computer languages such as BASIC. The established operating system for small computers was CP/M from Digital Research, which was in use both at the hobbyist level and by the more professional of those using microcomputers. To achieve such widespread use, and thus make the product economically viable, the OS had to operate across a range of machines from different vendors that had widely varying hardware. Customers who needed applications beyond the starter programs could reasonably expect publishers to offer their products for a variety of computers, on suitable media for each.
Microsoft's competing OS was initially intended to operate on a similarly varied spectrum of hardware, although all of it based on the 8086 processor. Thus, MS-DOS was for several years sold only as an OEM product. There was no Microsoft-branded MS-DOS: MS-DOS could not be purchased directly from Microsoft, and each OEM release was packaged with the trade dress of the given PC vendor. Malfunctions were to be reported to the OEM, not to Microsoft. However, as "compatibles" became widespread, it soon became clear that the OEM versions of MS-DOS were virtually identical, except perhaps for the provision of a few utility programs.
MS-DOS provided adequate functionality for character-oriented applications such as those that could have been implemented on a text-only terminal. Had the bulk of commercially important software been of this nature, low-level hardware compatibility might not have mattered. However, in order to provide maximum performance and leverage hardware features (or work around hardware bugs), PC applications quickly evolved beyond the simple terminal applications that MS-DOS supported directly. Spreadsheets, WYSIWYG word processors, presentation software and remote communication software established new markets that exploited the PC's strengths, but required capabilities beyond what MS-DOS provided. Thus, from very early in the development of the MS-DOS software environment, many significant commercial software products were written directly to the hardware, for a variety of reasons:
- MS-DOS itself did not provide any way to position the text cursor (except to advance it after printing each character). While the BIOS video interface routines were adequate for rudimentary output, they were inefficient: they had no "string" output (only output of individual characters) and they inserted delay periods to compensate for CGA hardware "snow" (a display artifact of CGA cards produced when writing directly to screen memory), an especially bad artifact since the routines were called by IRQs, making multitasking very difficult. A program that wrote directly to video memory could achieve output rates 5 to 20 times faster than making standard calls to the BIOS and MS-DOS (see the first sketch after this list). Turbo Pascal used this technique from its earliest versions.
- Graphics capability was not taken seriously in the original IBM design brief; graphics were considered only from the perspective of generating static business graphics such as charts and graphs. MS-DOS did not have an API for graphics, and the BIOS only included the most rudimentary of graphics functions (such as changing screen modes and plotting single points). To make a BIOS call for every point drawn or modified also increased overhead considerably, making the BIOS interface notoriously slow. Because of this, line-drawing, arc-drawing, and blitting had to be performed by the application to achieve acceptable speed, which was usually done by bypassing the BIOS and accessing video memory directly.
- Video games, even early ones, mostly required a true graphics mode. They also performed any machine-dependent trick the programmers could think of in order to gain speed. Though initially the major market for the PC was for business applications, games capability became an important factor motivating PC purchases as prices decreased. The availability and quality of games could mean the difference between the purchase of a PC compatible or a different platform like the Amiga.
- Communications software directly accessed the UART serial port chip, because the MS-DOS API and the BIOS did not provide full support and were too slow to keep up with hardware that could transfer data at 19200 bit/s (see the second sketch after this list).
- Even for standard business applications, speed of execution was a significant competitive advantage. The integrated software package Context MBA preceded Lotus 1-2-3 to market and included more functions, but it was written in the UCSD p-System, making it very portable yet too slow to be truly usable on a PC. 1-2-3 was written in x86 assembly language and performed some machine-dependent tricks. It was so much faster that it quickly surpassed Context MBA's sales.
- Disk copy-protection schemes, in common use at the time, worked by reading nonstandard data patterns on the diskette to verify originality. These patterns were impossible to detect using standard DOS or BIOS calls, so direct access to the disk controller hardware was necessary for the protection to work.
- Some software checked for evidence of a genuine IBM BIOS, such as an IBM copyright notice.[5]
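The speed gap from bypassing MS-DOS and the BIOS is easy to illustrate. Below is a minimal, hypothetical sketch (not taken from any particular product) of the direct-to-video-memory technique mentioned in the first point: it writes characters and attribute bytes straight into the colour text-mode buffer at segment B800h. It assumes a 16-bit real-mode DOS compiler in the Borland style, which supplies far pointers and MK_FP() in dos.h.

```c
/* Hypothetical sketch: writing text directly to video RAM instead of
   calling MS-DOS or the BIOS.  Assumes a 16-bit real-mode DOS compiler
   (Borland-style) providing far pointers and MK_FP() in <dos.h>.       */
#include <dos.h>

#define VIDEO_SEG 0xB800u        /* colour text-mode screen buffer      */
#define COLUMNS   80

/* Place a string at (row, col) with a given attribute byte.  Each screen
   cell is two bytes: the character followed by its colour attribute.    */
void fast_print(int row, int col, const char *s, unsigned char attr)
{
    unsigned char far *cell =
        (unsigned char far *) MK_FP(VIDEO_SEG, (row * COLUMNS + col) * 2);

    while (*s) {
        *cell++ = *s++;          /* character byte                            */
        *cell++ = attr;          /* attribute byte (e.g. 0x1F: white on blue) */
    }
}
```

An equivalent loop through DOS or BIOS character output would make one software interrupt per character, which is where speedups of the order cited above came from; the same motivation led graphics software to write pixels into video memory directly rather than call the BIOS point-plotting service.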
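For serial communication, the equivalent shortcut was to talk to the 8250/16550 UART registers through I/O ports. The following hedged sketch polls COM1 directly; the port addresses and register layout are the standard ones for that UART family, while the outportb()/inportb() calls assume a Borland-style DOS compiler and error handling is omitted.

```c
/* Hypothetical sketch: polling the COM1 UART directly instead of using
   the BIOS serial services.  Register layout is the standard 8250/16550
   one; outportb()/inportb() assume a Borland-style DOS compiler.        */
#include <dos.h>

#define COM1        0x3F8                /* base I/O port of COM1        */
#define THR         (COM1 + 0)           /* transmit holding register    */
#define LSR         (COM1 + 5)           /* line status register         */
#define LSR_THRE    0x20                 /* bit 5: transmitter ready     */
#define LSR_DR      0x01                 /* bit 0: received data ready   */

void uart_send(unsigned char c)
{
    while (!(inportb(LSR) & LSR_THRE))   /* wait until the UART can take a byte */
        ;
    outportb(THR, c);
}

int uart_poll_receive(void)              /* returns -1 if nothing is waiting */
{
    if (inportb(LSR) & LSR_DR)
        return inportb(COM1);            /* receive buffer shares offset 0    */
    return -1;
}
```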
The decreasing influence of IBM
As the IBM PC market grew, IBM's influence diminished. In November 1985 PC Magazine stated "Now that it has created the [PC] market, the market doesn't necessarily need IBM for the machines. It may depend on IBM to set standards and to develop higher-performance machines, but IBM had better conform to existing standards so as to not hurt users".[13] In January 1987 BYTE wrote of rumors that IBM would introduce proprietary personal computers with a proprietary operating system: "Who cares? If IBM does it, they will most likely just isolate themselves from the largest marketplace, in which they really can't compete anymore anyway". The magazine predicted that in 1987 the market "will complete its transition from an IBM standard to an Intel/MS-DOS/expansion bus standard ... Folks aren't so much concerned about IBM compatibility as they are about Lotus 1-2-3 compatibility".[14] By 1988 Gartner Group estimated that the public purchased 1.5 clones for every IBM PC.[15]

After 1987, IBM PC compatibles dominated both the home and business markets of commodity computers,[16] with other notable alternative architectures serving niche markets: the Macintosh computers offered by Apple Inc. and used mainly for desktop publishing at the time, the aging 8-bit Commodore 64, which was selling for $150 by this time and had become the world's best-selling computer, the 16-bit Commodore Amiga line used for television and video production, and the 16-bit Atari ST used by the music industry. However, IBM itself lost the leading role in the market for IBM PC compatibles by 1990. In retrospect, a few events were important:
- The 1982 introduction of the Compaq Portable, the first 100% IBM PC compatible computer, providing portability unavailable from IBM at the time.
- An Independent Business Unit (IBU) within IBM developed the IBM PC and XT. IBUs did not share in corporate R&D expense. After the IBU became the Entry Systems Division it lost this benefit, greatly decreasing margins.[17]
- The availability by 1986 of sub-$1000 "Turbo XT" PC XT compatibles, including early offerings from Dell Computer, reducing demand for IBM's models.[18][19] It was possible to buy two of these "generic" systems for less than the cost of one IBM-branded PC AT, and many companies did just that.
- Compaq beating IBM to market in 1986 with the Compaq Deskpro 386, the first 80386-based PC.
- IBM's 1987 introduction of the incompatible and proprietary MicroChannel Architecture (MCA) computer bus, for its Personal System/2 (PS/2) line.[15]
- The 1988 introduction by the "Gang of Nine" companies of a rival bus, Extended Industry Standard Architecture, intended to compete with, rather than copy, MCA.[15]
- The duelling Expanded memory and Extended memory standards of the late 1980s, both developed without input from IBM.
As of October 2007, Hewlett-Packard and Dell have the largest shares of the PC market in North America. They are also successful overseas, with Acer, Lenovo, and Toshiba also notable. Worldwide, a huge number of PCs are "white box" systems assembled by myriad local system builders. Despite advances in computer technology, all current IBM PC compatibles remain very much compatible with the original IBM PC computers, although most of the components implement the compatibility only in special backward-compatibility modes used during system boot.
Expandability
One of the strengths of the PC compatible design is its modular hardware design. End-users could readily upgrade peripherals and, to some degree, processor and memory without modifying the computer's motherboard or replacing the whole computer, as was the case with many of the microcomputers of the time. However, as processor speed and memory width increased, the limits of the original XT/AT bus design were soon reached, particularly when driving graphics video cards. IBM did introduce an upgraded bus in the IBM PS/2 computer that overcame many of the technical limits of the XT/AT bus, but this was rarely used as the basis for IBM compatible computers since it required license payments to IBM both for the PS/2 bus and for any prior AT-bus designs produced by the company seeking a license. This was unpopular with hardware manufacturers, and several competing bus standards were developed by consortiums with more agreeable license terms. Various attempts to standardize the interfaces were made, but in practice, many of these attempts were either flawed or ignored. Even so, there were many expansion options, and despite the confusion of its users, the PC compatible design advanced much faster than other competing designs of the time, even if only because of its market dominance.

"IBM PC compatible" becomes "Wintel"
During the 1990s, IBM's influence on PC architecture started to decline. An IBM-brand PC became the exception rather than the rule. Instead of placing importance on compatibility with the IBM PC, vendors began to emphasize compatibility with Windows. In 1993, a version of Windows NT was released that could operate on processors other than the x86 family. While it required that applications be recompiled, which most developers did not do, its hardware independence was used for Silicon Graphics (SGI) x86 workstations: thanks to NT's hardware abstraction layer (HAL), they could operate NT (and its vast application library)[clarification needed].

No mass-market personal computer hardware vendor dared to be incompatible with the latest version of Windows, and Microsoft's annual WinHEC conferences provided a setting in which Microsoft could lobby for, and in some cases dictate, the pace and direction of the hardware of the PC industry. Microsoft and Intel had become so important to the ongoing development of PC hardware that industry writers began using the portmanteau word Wintel to refer to the combined hardware-software system.
This terminology itself is becoming a misnomer, as Intel has lost absolute control over the direction of x86 hardware development with AMD's AMD64. Also, non-Windows operating systems like Mac OS X and Linux have established a presence on the x86 architecture.
Design limitations and more compatibility issues
Although the IBM PC was designed for expandability, the designers could not anticipate the hardware developments of the 1980s, nor the size of the industry they would engender. To make things worse, IBM's choice of the Intel 8088 for the CPU introduced several limitations for developing software for the PC compatible platform. For example, the 8088 processor had only a 20-bit memory address space. To expand PCs beyond one megabyte, Lotus, Intel, and Microsoft jointly created expanded memory (EMS), a bank-switching scheme that allowed more memory, provided by add-in hardware and accessed through a set of four 16-kilobyte "windows" inside the 20-bit address space. Later Intel CPUs had larger address spaces and could directly address 16 megabytes (80286) or more, causing Microsoft to develop extended memory (XMS), which did not require additional hardware.

"Expanded" and "extended" memory have incompatible interfaces, so anyone writing software that used more than one megabyte had to support both systems for the greatest compatibility, until MS-DOS began including EMM386, which simulated EMS memory using XMS memory. A protected-mode OS could also be written for the 80286, but DOS application compatibility proved harder than expected, not only because most DOS applications accessed the hardware directly, bypassing the BIOS routines intended to ensure compatibility, but also because most BIOS requests were made through the first 32 interrupt vectors, which Intel had marked as "reserved" for protected-mode processor exceptions.
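To make the bank-switching scheme concrete, the following is a hedged, minimal sketch of the LIM EMS calling sequence (INT 67h): the driver reports a 64 KB page frame in the upper memory area, the program allocates 16 KB logical pages of expanded memory above the 1 MB limit, and maps one of them into the frame so it becomes reachable through an ordinary 20-bit segment:offset pointer. The function numbers are the standard LIM EMS ones; int86(), union REGS and MK_FP() assume a Borland-style 16-bit DOS compiler, and error checking is omitted for brevity.

```c
/* Hypothetical sketch of LIM EMS bank switching via INT 67h.  Assumes a
   Borland-style 16-bit DOS compiler for union REGS, int86() and MK_FP();
   error checking (AH == 0 on success) is omitted for brevity.           */
#include <dos.h>

int main(void)
{
    union REGS r;
    unsigned frame_seg, handle;
    unsigned char far *frame;

    r.h.ah = 0x41;                  /* EMS 41h: get page frame segment     */
    int86(0x67, &r, &r);
    frame_seg = r.x.bx;             /* four 16 KB windows start here       */

    r.h.ah = 0x43;                  /* EMS 43h: allocate logical pages     */
    r.x.bx = 4;                     /* 4 x 16 KB of expanded memory        */
    int86(0x67, &r, &r);
    handle = r.x.dx;

    r.h.ah = 0x44;                  /* EMS 44h: map a logical page...      */
    r.h.al = 0;                     /* ...into physical window 0           */
    r.x.bx = 0;                     /* logical page 0 of our allocation    */
    r.x.dx = handle;
    int86(0x67, &r, &r);

    frame = (unsigned char far *) MK_FP(frame_seg, 0);
    frame[0] = 0x42;                /* the write lands in expanded memory  */

    r.h.ah = 0x45;                  /* EMS 45h: release the pages          */
    r.x.dx = handle;
    int86(0x67, &r, &r);
    return 0;
}
```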
Video cards suffered from their own incompatibilities. Once video cards advanced to Super VGA (SVGA), the standard for accessing them was no longer clear. At the time, PC programming used a memory model with 64 KB memory segments. The most common VGA graphics mode's screen memory fit into a single memory segment; SVGA modes required more memory, so accessing the full screen memory was tricky. Each manufacturer developed its own methods of accessing the screen memory, even going so far as not to number the modes consistently. An attempt at creating a standard, the VESA BIOS Extensions (VBE), was made, but not all manufacturers adhered to it.
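The segment problem can be illustrated with a hedged sketch of plotting one pixel in a banked SVGA mode through the VBE "display window control" call (INT 10h, AX=4F05h). It assumes a 256-colour mode, a window at segment A000h with 64 KB granularity, and a Borland-style DOS compiler; real code would read the window size, granularity and scan-line pitch from the VBE mode information rather than assuming them, and vendor-specific cards each needed their own variant of the bank switch.

```c
/* Hypothetical sketch of plotting a pixel in a banked SVGA mode via the
   VESA BIOS Extensions (function 05h, display window control).  Assumes
   a 256-colour mode, a 64 KB window at segment A000h with 64 KB
   granularity, and a Borland-style DOS compiler for int86()/MK_FP().    */
#include <dos.h>

static unsigned current_bank = 0xFFFFu;  /* force a switch on first use  */

static void set_vbe_bank(unsigned bank)
{
    union REGS r;
    r.x.ax = 0x4F05;    /* VBE: display window control                   */
    r.h.bh = 0x00;      /* sub-function: set window position             */
    r.h.bl = 0x00;      /* window A                                      */
    r.x.dx = bank;      /* position, in window granularity units         */
    int86(0x10, &r, &r);
}

void put_pixel(unsigned x, unsigned y, unsigned char colour, unsigned pitch)
{
    unsigned long offset = (unsigned long) y * pitch + x;  /* may exceed 64 KB */
    unsigned bank = (unsigned) (offset >> 16);             /* which 64 KB bank */
    unsigned char far *vram =
        (unsigned char far *) MK_FP(0xA000, (unsigned) offset);

    if (bank != current_bank) {     /* switch banks only when necessary  */
        set_vbe_bank(bank);
        current_bank = bank;
    }
    *vram = colour;
}
```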
When the 386 was introduced, a protected-mode OS could again be written for it. This time, DOS compatibility was much easier because of virtual 8086 mode. Unfortunately, programs could not switch directly between the two modes, so eventually some new memory-model APIs were developed, VCPI and DPMI, the latter becoming the most popular.
Because of the great number of third-party adapters and the lack of standards for them, programming the PC could be difficult. Professional developers would maintain a large test suite of various known-to-be-popular hardware combinations.
Meanwhile, consumers were overwhelmed by the competing, incompatible standards and many different combinations of hardware on offer. To give them some idea of what sort of PC they would need to operate their software, the Multimedia PC (MPC) standard was set during 1990. A PC that met the minimum MPC standard could be marketed with the MPC logo, giving consumers an easy-to-understand specification to look for. Software that could operate on the most minimally MPC-compliant PC would be guaranteed to operate on any MPC. The MPC level 2 and MPC level 3 standards were set later, but the term "MPC compliant" never became popular. After MPC level 3 during 1996, no further MPC standards were established.
Challenges to Wintel domination
By the late 1990s, the success of Microsoft Windows had driven rival commercial operating systems into near-extinction, and had ensured that the "IBM PC compatible" computer was the dominant computing platform. This meant that if a developer made their software only for the Wintel platform, they would still be able to reach the vast majority of computer users. By the late 1980s, the only major competitor to the DOS/Windows platform with more than a few percentage points of market share was Apple Inc.'s Macintosh. The Mac started out billed as "the computer for the rest of us", but its high prices and closed architecture meant the DOS/Windows/Intel onslaught quickly drove the Macintosh into an education and desktop publishing niche, from which it has only recently begun to emerge. By the mid-1990s the Mac's market share had dwindled to around 5%, and introducing a new rival operating system had become too risky a commercial venture. Experience had shown that even if an operating system was technically superior to Windows, it would be a failure in the market (BeOS and OS/2, for example). In 1989 Steve Jobs said of his new NeXT system, "It will either be the last new hardware platform to succeed, or the first to fail." Four years later, in 1993, NeXT announced it was ending production of the NeXTcube and porting NeXTSTEP to Intel processors.

For hardware, Intel initially licensed its technology so that other manufacturers could make x86 central processing units (CPUs). As the "Wintel" platform gained dominance, Intel abandoned this practice. Companies such as AMD and Cyrix developed alternative CPUs that were functionally compatible with Intel's. Towards the end of the 1990s, AMD was taking an increasing share of the CPU market for PCs. AMD even ended up playing a significant role in directing the evolution of the x86 platform when its Athlon line of processors continued to develop the classic x86 architecture as Intel deviated with its NetBurst architecture for the Pentium 4 CPUs and the IA-64 architecture for the Itanium line of server CPUs. AMD developed AMD64, the first major extension of the architecture not created by Intel, which Intel later adopted as x86-64. In 2006 Intel began abandoning NetBurst with the release of its "Core" processors, which represented an evolution of the earlier Pentium III.
The IBM PC compatible today
The term 'IBM PC compatible' is not commonly used today, because all current mainstream desktop and laptop computers are based on the PC architecture and IBM no longer makes PCs. The competing hardware architectures have either been discontinued or, like the Amiga, have been relegated to niche, enthusiast markets. The most successful exception was Apple Inc.'s Macintosh platform, which used non-Intel processors from its inception. Initially based on the Motorola 68000 family and later on the PowerPC architecture that Apple co-developed with IBM and Motorola, the Macintosh completed its transition to Intel processors by 2007. Today's Macintosh computers share the same system architecture as their Wintel counterparts and can run Microsoft Windows either in a dual-boot arrangement or, using third-party software, in a window alongside Mac OS X.

The processor speed and memory capacity of modern PCs are many orders of magnitude greater than they were for the original IBM PC, and yet backwards compatibility has been largely maintained: a 32-bit operating system published during the 2000s can still operate many of the simpler programs written for the OS of the early 1980s without needing an emulator, and an emulator such as DOSBox now provides near-native functionality at full speed. Additionally, many modern PCs can still run DOS directly, although special options such as USB legacy mode and SATA-to-PATA emulation may need to be set in the BIOS setup utility. Computers using the Extensible Firmware Interface may need to be set to legacy BIOS mode to be able to boot DOS. However, the BIOS/EFI options in most mass-produced consumer-grade computers are very limited and cannot be configured to truly handle OSes such as the original variants of DOS.
The spread of the x86-64 architecture has severely impacted the original IBM PC compatibility that existed for so long. While the instruction set itself is compatible in theory, virtually all 64-bit processors and operating systems in distribution have requirements that are simply too far removed from the original PC standards to ultimately remain backwards compatible.
See also
- AT (form factor)
- ATX form factor
- Baby AT form factor
- Computer hardware
- Computer software
- Computing platform
- Custom built PC
- History of computing hardware (1960s–present)
- Homebuilt computer
- IBM Personal Computer
- Influence of the IBM PC on the personal computer market
- PC speaker
- Personal computer
- x86 architecture
- MS-DOS
- CP/M