The 20th century was nearly into its fourth decade before the first electronic computer came along, and those early machines were behemoths capable of only the most basic tasks. Today, tiny "handhelds" are used for word processing and storage, delivery of documents and images, inventory management, and remote access by workers to central offices. Programmable electronic devices of all sorts have come to pervade modern society to such a degree that future generations may well designate the 20th century as the Computer Age.
1936
"A Symbolic Analysis of Relay and Switching Circuits"
Electrical engineer and mathematician Claude Shannon, in his master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," uses Boolean algebra to establish a working model for digital circuits. This paper, as well as later research by Shannon, lays the groundwork for the future telecommunications and computer industries.

1939
Atanasoff-Berry Computer, the first electronic computer
John Atanasoff and Clifford Berry at Iowa State College design the first electronic computer. The obscure project, called the Atanasoff-Berry Computer (ABC), incorporates binary arithmetic and electronic switching. Before the computer is perfected, Atanasoff is recruited by the Naval Ordnance Laboratory and never resumes work on the machine. However, in the summer of 1941, at Atanasoff's invitation, computer pioneer John Mauchly of the University of Pennsylvania visits Atanasoff in Iowa and sees the ABC demonstrated.

1939
First binary digital computers are developed
The first binary digital computers are developed. Bell Labs's George Stibitz designs the Complex Number Calculator, which performs mathematical operations in binary form using on-off relays and finds the quotient of two 8-digit numbers in 30 seconds. In Germany, Konrad Zuse develops the first programmable calculator, the Z2, using binary numbers and Boolean algebra, programmed with punched tape.

1943
First vacuum-tube programmable logic calculator
Colossus, the world's first vacuum-tube programmable logic calculator, is built in Britain for the purpose of breaking Nazi codes. On average, Colossus deciphers a coded message in two hours.

1945
Specifications of a stored-program computer
Two mathematicians, Briton Alan Turing and Hungarian John von Neumann, work independently on the specifications of a stored-program computer. Von Neumann writes a document describing a computer on which data and programs can be stored. Turing publishes a paper on an Automatic Computing Engine, based on the principles of speed and memory.

1946
First electronic computer put into operation
The first electronic computer put into operation is developed late in World War II by John Mauchly and John Presper Eckert at the University of Pennsylvania's Moore School of Electrical Engineering. The Electronic Numerical Integrator and Computer (ENIAC), used for ballistics computations, weighs 30 tons and includes 18,000 vacuum tubes, 6,000 switches, and 1,500 relays.

1947
Transistor is invented
John Bardeen, Walter Brattain, and William Shockley of Bell Telephone Laboratories invent the transistor, the solid-state switch that will eventually replace the vacuum tube in computers.

1949
First stored-program computer is built
The Electronic Delay Storage Automatic Calculator (EDSAC), the first stored-program computer, is built and programmed by British mathematical engineer Maurice Wilkes.

1951
First computer designed for U.S. business
Eckert and Mauchly, now with their own company (later sold to Remington Rand), design UNIVAC (UNIVersal Automatic Computer)—the first computer for U.S. business. Its breakthrough feature: magnetic tape storage to replace punched cards. First developed for the Bureau of the Census to aid in census data collection, UNIVAC passes a highly public test by correctly predicting Dwight Eisenhower's victory over Adlai Stevenson in the 1952 presidential race. But months before UNIVAC is completed, the British firm J. Lyons & Company unveils the first computer for business use, the LEO (Lyons Electronic Office), which eventually calculates the company's weekly payroll.

1952
First computer compiler
Grace Murray Hopper, a senior mathematician at Eckert-Mauchly Computer Corporation and a programmer for Harvard's Mark I computer, develops the first computer compiler, a program that translates computer instructions from English into machine language. She later creates Flow-Matic, the first programming language to use English words and the key influence for COBOL (Common Business Oriented Language). Attaining the rank of rear admiral in a navy career that brackets her work at Harvard and Eckert-Mauchly, Hopper eventually becomes the driving force behind many advanced automated programming technologies.

1955
First disk drive for random-access storage of data
IBM engineers led by Reynold Johnson design the first disk drive for random-access storage of data, offering more surface area for magnetization and storage than earlier drums. In later drives a protective "boundary layer" of air between the heads and the disk surface would be provided by the spinning disk itself. The Model 305 Disk Storage unit, later called the Random Access Method of Accounting and Control, is released in 1956 with a stack of fifty 24-inch aluminum disks storing 5 million bytes of data.

1957
FORTRAN becomes commercially available
FORTRAN (for FORmula TRANslation), a high-level programming language developed by an IBM team led by John Backus, becomes commercially available. FORTRAN is a way to express scientific and mathematical computations with a programming language similar to mathematical formulas. Backus and his team claim that the FORTRAN compiler produces machine code as efficient as any produced directly by a human programmer. Other programming languages quickly follow, including ALGOL, intended as a universal computer language, in 1958 and COBOL in 1959. ALGOL has a profound impact on future languages such as Simula (the first object-oriented programming language), Pascal, and C/C++. FORTRAN becomes the standard language for scientific computer applications, and COBOL is developed by the U.S. government to standardize its commercial application programs. Both dominate the computer-language world for the next 2 decades.

1958
Integrated circuit invented
Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently invent the integrated circuit. (See Electronics.)

1960
Digital Equipment Corporation introduces the "compact" PDP-1
Digital Equipment Corporation introduces the "compact" PDP-1 for the science and engineering market. Not including software or peripherals, the system costs $125,000, fits in a corner of a room, and doesn't require air conditioning. Operated by one person, it features a cathode-ray tube display and a light pen. In 1962 at MIT a PDP-1 becomes the first computer to run a video game when Steve Russell programs it to play "Spacewar." The PDP-8, released 5 years later, is the first computer to fully use integrated circuits.

1964
BASIC
Dartmouth professors John Kemeny and Thomas Kurtz develop the BASIC (Beginner's All-Purpose Symbolic Instruction Code) programming language specifically for the school's new timesharing computer system. Designed for non-computer-science students, it is easier to use than FORTRAN. Other schools and universities adopt it, and computer manufacturers begin to provide BASIC translators with their systems.

1968
Computer mouse makes its public debut
The computer mouse makes its public debut during a demonstration at a computer conference in San Francisco. Its inventor, Douglas Engelbart of the Stanford Research Institute, also demonstrates other user-friendly technologies such as hypermedia with object linking and addressing. Engelbart receives a patent for the mouse 2 years later.

1970
Palo Alto Research Center (PARC)
Xerox Corporation assembles a team of researchers in information and physical sciences in Palo Alto, California, with the goal of creating "the architecture of information." Over the next 30 years innovations emerging from the Palo Alto Research Center (PARC) include the concept of windows (1972), the first real personal computer (Alto in 1973), laser printers (1973), the concept of WYSIWYG (what you see is what you get) word processors (1974), and Ethernet (1974). In 2002 Xerox PARC incorporates as an independent company—Palo Alto Research Center, Inc.

1975
First home computer is marketed to hobbyists
The Altair 8800, widely considered the first home computer, is marketed to hobbyists by Micro Instrumentation Telemetry Systems. The build-it-yourself kit doesn't have a keyboard, monitor, or its own programming language; data are input with a series of switches and lights. But it includes an Intel microprocessor and costs less than $400. Seizing an opportunity, fledgling entrepreneurs Bill Gates and Paul Allen propose writing a version of BASIC for the new computer. They start the project by forming a partnership called Microsoft.

1977
Apple II is released
Apple Computer, founded by electronics hobbyists Steve Jobs and Steve Wozniak, releases the Apple II, a desktop personal computer for the mass market that features a keyboard, video monitor, and random-access memory (RAM) that can be expanded by the user. Independent software manufacturers begin to create applications for it.

1979
First laptop computer is designed
What is thought to be the first laptop computer is designed by William Moggridge of GRiD Systems Corporation in England. The GRiD Compass 1109 has 340 kilobytes of bubble memory and a folding electroluminescent display screen in a magnesium case. Used by NASA in the early 1980s for its shuttle program, the "portable computer" is patented by GRiD in 1982.

1979
First commercially successful business application
Harvard MBA student Daniel Bricklin and programmer Bob Frankston launch the VisiCalc spreadsheet for the Apple II, a program that helps drive sales of the personal computer and becomes its first commercially successful business application. VisiCalc owns the spreadsheet market for nearly a decade before being eclipsed by Lotus 1-2-3, a spreadsheet program designed by a former VisiCalc employee.

1981
IBM Personal Computer released
IBM introduces the IBM Personal Computer with an Intel 8088 microprocessor and an operating system—MS-DOS—designed by Microsoft. Fully equipped with 64 kilobytes of memory and a floppy disk drive, it costs under $3,000.

1984
Macintosh is introduced
Apple introduces the Macintosh, a low-cost, plug-and-play personal computer whose central processor fits on a single circuit board. Although it doesn't offer enough power for business applications, its easy-to-use graphic interface finds fans in education and publishing.

1984
CD-ROM introduced
Philips and Sony combine efforts to introduce the CD-ROM (compact disc read-only memory), patented in 1970 by James T. Russell. With the advent of the CD, data storage and retrieval shift from magnetic to optical technology. The CD can store more than 300,000 pages worth of information—more than the capacity of 450 floppy disks—and it can hold digital text, video, and audio files. Advances in the 1990s allow users not only to read prerecorded CDs but also to download, write, and record information onto their own disks.

1985
Windows 1.0 is released
Microsoft releases Windows 1.0, operating system software that features a Macintosh-like graphical user interface (GUI) with drop-down menus, windows, and mouse support. Because the program runs slowly on available PCs, most users stick to MS-DOS. Higher-powered microprocessors beginning in the late 1980s make the next attempts—Windows 3.0 and Windows 95—more successful.

1991
World Wide Web
Tim Berners-Lee of the European particle physics laboratory CERN releases the World Wide Web, a system that uses hypertext links to make documents and other resources on the Internet easy to publish and retrieve.

1992
Personal digital assistant
Apple chairman John Sculley coins the term "personal digital assistant" to refer to handheld computers. One of the first on the market is Apple's Newton, which has a liquid crystal display operated with a stylus. The more successful Palm Pilot is released by 3Com in 1996.

1999
Palm VII connected organizer
Responding to a more mobile workforce, handheld computer technology leaps forward with the Palm VII connected organizer, the combination of a computer with 2 megabytes of RAM and a port for a wireless phone. At less than $600, the computer weighs 6.7 ounces and operates for up to 3 weeks on two AAA batteries. Later versions offer 8 megabytes of RAM, Internet connectivity, and color screens for less than $500.
For all its limitations, the Altair 8800 of 1975 was an authentic general-purpose digital computer, a device traditionally associated with air-conditioned sanctums and operation by a technical elite. The Altair's maker, counting on the curiosity of electronics hobbyists, hoped to sell a few hundred. Instead, orders poured in by the thousands, signaling an appetite that, by the end of the century, would put tens of millions of personal computers in homes, offices, and schools around the world. Once again, the greatest productivity tool ever invented would wildly outstrip all expectations.
When the programmable digital computer was born shortly before mid-century, there was little reason to expect that it would someday be used to write letters, keep track of supermarket inventories, run financial networks, make medical diagnoses, help design automobiles, play games, deliver e-mail and photographs across the Internet, orchestrate battles, guide humans to the moon, create special effects for movies, or teach a novice to type. In the dawn years its sole purpose was to reduce mathematical drudgery, and its value for even that role was less than compelling. One of the first of the breed was the Harvard Mark I, conceived in the late 1930s by Harvard mathematician Howard Aiken and built by IBM during World War II to solve difficult ballistics problems. The Mark I was 51 feet long and 8 feet high, had 750,000 parts and 500 miles of wiring, and was fed data in the form of punched cards—an input method used for tabulating equipment since the late 19th century. This enormous machine could do just three additions or subtractions a second.
A route to far greater speeds was at hand, however. It involved basing a computer's processes on the binary numbering system, which uses only zeros and ones instead of the 10 digits of the decimal system. In the mid-19th century the British mathematician George Boole devised a form of algebra that encoded logic in terms of two states—true or false, yes or no, one or zero. If expressed that way, practically any mathematical or logical problem could be solved by just three basic operations, dubbed "and," "or," and "not." During the late 1930s several researchers realized that Boole's operations could be given physical form as arrangements of switches—a switch being a two-state device, on or off. Claude Shannon, a mathematician and engineer at the Massachusetts Institute of Technology (MIT), spelled this out in a masterful paper in 1938. At about the time Shannon was working on his paper, George Stibitz of AT&T's Bell Laboratories built such a device, using strips of tin can, flashlight bulbs, and surplus relays. The K-Model, as Stibitz called it (for kitchen table), could add two bits and display the result. In 1939, John Atanasoff, a physicist at Iowa State College, also constructed a rudimentary binary machine, and unknown to them all, a German engineer named Konrad Zuse created a fully functional general-purpose binary computer (the Z3) in 1941, only to see further progress thwarted by Hitler's lack of interest in long-term scientific research.
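The power of that reduction is easy to demonstrate with modern tools. The short Python sketch below (an illustration added here, not anything from the period) builds ordinary binary addition out of nothing but "and," "or," and "not" acting on single bits, doing in software what Shannon showed could be done with arrangements of switches.

    # Illustrative sketch: binary addition built from Boole's three operations.
    # Every value here is a single bit, 0 or 1, standing in for an open or closed switch.
    def AND(a, b): return a & b
    def OR(a, b): return a | b
    def NOT(a): return 1 - a

    def XOR(a, b):
        # "a or b, but not both," expressed using only AND, OR, and NOT
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def full_adder(a, b, carry_in):
        # Adds three bits; returns the sum bit and the carry bit.
        total = XOR(XOR(a, b), carry_in)
        carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
        return total, carry_out

    def add(x, y, width=8):
        # Chains full adders, one per bit position, like a row of relays.
        carry, result = 0, 0
        for i in range(width):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= bit << i
        return result

    print(add(19, 23))  # prints 42, computed entirely with AND, OR, and NOT on bits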
The switches used in most early computers were electromechanical relays, developed for the telephone system, but they soon gave way to vacuum tubes, which could turn an electric current on or off much more quickly. The first large-scale, all-electronic computer, ENIAC, took shape late in the war at the University of Pennsylvania's Moore School of Electrical Engineering under the guidance of John Mauchly and John Presper Eckert. Like the Mark I, it was huge—30 tons, 150 feet wide, with 20 banks of flashing lights—and it too was intended for ballistics calculations, but ENIAC could process numbers a thousand times faster. Even before it was finished, Mauchly and Eckert were making plans for a successor machine called EDVAC, conceived with versatility in mind.
Although previous computers could shift from one sort of job to another if given new instructions, this was a tedious process that might involve adjusting hundreds of controls or unplugging and replugging a forest of wires. EDVAC, by contrast, was designed to receive its instructions electronically; moreover, the program, coded in zeros and ones, would be kept in the same place that held the numbers the computer would be processing. This approach—letting a program treat its own instructions as data—offered huge advantages. It would accelerate the work of the computer, simplify its circuitry, and make possible much more ambitious programming. The stored-program idea spread rapidly, gaining impetus from a lucid description by one of the most famous mathematicians in the world, John von Neumann, who had taken an interest in EDVAC.
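A toy version of the idea can make it concrete. The Python sketch below (a modern illustration with a made-up three-instruction machine, not a model of EDVAC itself) keeps a tiny program and the numbers it works on in one shared memory, which is the essence of the stored-program design.

    # Toy stored-program machine: instructions and data live in the same memory.
    # The opcodes are invented for this illustration:
    #   1 addr  load memory[addr] into the accumulator
    #   2 addr  add memory[addr] to the accumulator
    #   3 addr  store the accumulator into memory[addr]
    #   0       halt
    memory = [
        1, 8,        # load the value at address 8
        2, 9,        # add the value at address 9
        3, 10,       # store the result at address 10
        0, 0,        # halt
        19, 23, 0,   # the data: two operands and an empty slot for the answer
    ]

    pc, acc = 0, 0               # program counter and accumulator
    while memory[pc] != 0:       # fetch, decode, execute until the halt code
        op, addr = memory[pc], memory[pc + 1]
        if op == 1:
            acc = memory[addr]
        elif op == 2:
            acc += memory[addr]
        elif op == 3:
            memory[addr] = acc
        pc += 2                  # every instruction here occupies two memory cells

    print(memory[10])            # prints 42; the program and its answer share one memory

Because the instructions are just numbers sitting in memory, a program can in principle read, generate, or modify instructions the same way it handles any other data, which is exactly the flexibility the stored-program approach offered.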
Building such a machine posed considerable engineering challenges, and EDVAC would not be the first to clear the hurdles. That honor was claimed in the spring of 1949 by a 3,000-tube stored-program computer dubbed EDSAC, the creation of British mathematical engineer Maurice Wilkes, of Cambridge University.
Meanwhile, Eckert and Mauchly had left the Moore School and established a company to push computing into the realm of commerce. The product they envisioned, a 5,000-tube machine called UNIVAC, had a breakthrough feature—storing data on magnetic tape rather than by such unwieldy methods as punched cards. Although a few corporate customers were lined up in advance, development costs ran so high that the two men had to sell their company to the big office equipment maker Remington Rand. Their design proved a marketplace winner, however. Completed in 1951, UNIVAC was rugged, reliable, and able to perform almost 2,000 calculations per second. Its powers were put to a highly public test during the 1952 presidential election, when CBS gave UNIVAC the job of forecasting the outcome from partial voting returns. Early in the evening the computer (represented by a fake bank of blinking lights in the CBS studio) projected a landslide victory by Dwight Eisenhower over Adlai Stevenson. The prediction was made in such unequivocal terms that UNIVAC's operators grew nervous and altered the program to produce a closer result. They later confessed that the initial projection of electoral votes had been right on the mark.
By then several dozen other companies had jumped into the field. The most formidable was International Business Machines (IBM), a leading supplier of office equipment since early in the century. With its deep knowledge of corporate needs and its peerless sales force, IBM soon eclipsed all rivals. Other computer makers often expected customers to write their own applications programs, but IBM was happy to supply software for invoicing, payroll, production forecasts, and other standard corporate tasks. In time the company created extensive suites of software for such business sectors as banking, retailing, and insurance. Most competitors lacked the resources and revenue to keep pace.
Some of the computer projects taken on by IBM were gargantuan in scope. During the 1950s the company had as many as 8,000 employees laboring to computerize the U.S. air defense system. The project, known as SAGE and based on developmental work done at MIT's Lincoln Laboratory, called for a network of 23 powerful computers to process radar information from ships, planes, and ground stations while also analyzing weather, tracking weapons availability, and monitoring a variety of other matters. Each computer had 49,000 tubes and weighed 240 tons—the biggest ever built.
Almost as complex was an airline reservation system, called SABRE, that IBM created for American Airlines in the late 1950s and early 1960s. Using a million lines of program code and two big computers, it linked agents in 50 cities and could handle millions of transactions a year, processing them at the rate of one every 3 seconds. But writing the software for SAGE or SABRE was child's play compared to what IBM went through in the 1960s when it decided to overhaul its increasingly fragmented product line and make future machines compatible—alike in how they read programs, processed data, and dealt with input and output devices. Compatibility required an all-purpose operating system, the software that manages a computer's basic procedures, and it had to be written from scratch. That job took about 5,000 person-years of work and roughly half a billion dollars, but the money was well spent. The new product line, known as System/360, was a smash hit, in good part because it gave customers unprecedented freedom in mixing and matching equipment.
By the early 1970s technology was racing to keep up with the thirst for electronic brainpower in corporations, universities, government agencies, and other such big traffickers in data. Vacuum-tube switches had given way a decade earlier to smaller, cooler, less power-hungry transistors, and now the transistors, along with other electronic components, were being packed together in ever-increasing numbers on silicon chips. In addition to their processing roles, these chips were becoming the technology of choice for memory, the staging area where data and instructions are shuttled in and out of the computer—a job long done by arrays of tiny ferrite doughnuts that registered data magnetically. Storage, the part of a computing system where programs and data are kept in readiness, had gone through punched card, magnetic tape, and magnetic drum phases; now high-speed magnetic disks ruled. High-level programming languages such as FORTRAN (for science applications), COBOL (for business), and BASIC (for beginners) allowed software to be written in English-like commands rather than the abstruse codes of the early days.
Some computer makers specialized in selling prodigiously powerful machines to such customers as nuclear research facilities or aerospace manufacturers. A category called supercomputers was pioneered in the mid-1960s by Control Data Corporation, whose chief engineer, Seymour Cray, designed the CDC 6600, a 350,000-transistor machine that could execute 3 million instructions per second. The price: $6 million. At the opposite end of the scale, below big mainframe machines like those made by IBM, were minicomputers, swift enough for many scientific or engineering applications but at a cost of tens of thousands rather than hundreds of thousands of dollars. Their development was spearheaded by Kenneth Olsen, an electrical engineer who cofounded Digital Equipment Corporation and had close ties to MIT.
Then, with the arrival of the humble Altair in 1975, the scale suddenly plunged to a level never imagined by industry leaders. What made such a compact, affordable machine possible was the microprocessor, which concentrated all of a computer's arithmetical and logical functions on a single chip—a feat first achieved by an engineer named Ted Hoff at Intel Corporation in 1971. After the Intel 8080 microprocessor was chosen for the Altair, two young computer buffs from Seattle, Bill Gates and Paul Allen, won the job of writing software that would allow it to be programmed in BASIC. By the end of the century the company they formed for that project, Microsoft, had annual sales greater than many national economies.
Nowhere was interest in personal computing more intense than in the vicinity of Palo Alto, California, a place known as Silicon Valley because of the presence of many big semiconductor firms. Electronics hobbyists abounded there, and two of them—Steve Jobs and Steve Wozniak—turned their tinkering into a highly appealing consumer product: the Apple II, a plastic-encased computer with a keyboard, screen, and cassette tape for storage. It arrived on the market in 1977, described in its advertising copy as "the home computer that's ready to work, play, and grow with you." Few packaged programs were available at first, but they soon arrived from many quarters. Among them were three kinds of applications that made this desktop device a truly valuable tool for business—word processing, spreadsheets, and databases. The market for personal computers exploded, especially after IBM weighed in with a product in 1981. Its offering used an operating system from Microsoft, MS-DOS, which was quickly adopted by other manufacturers, allowing any given program to run on a wide variety of machines.
The next 2 decades saw computer technology rocketing ahead on every front. Chips doubled in density almost annually, while memory and storage expanded by leaps and bounds. Hardware like the mouse made the computer easier to control; operating systems allowed the screen to be divided into independently managed windows; applications programs steadily widened the range of what computers could do; and processors were lashed together, thousands of them in some cases, in order to solve pieces of a problem in parallel. Meanwhile, new communications standards enabled computers to be joined in private networks or the incomprehensibly intricate global weave of the Internet.
Where it all will lead is unknowable, but the rate of advance is almost certain to be breathtaking. When the Mark I went to work calculating ballistics tables back in 1943, it was described as a "robot superbrain" because of its ability to multiply a pair of 23-digit numbers in 3 seconds. Today, some of its descendants need just 1 second to perform several hundred trillion mathematical operations—a performance that, in a few years, will no doubt seem slow.
William H. Gates III
Chairman and Chief Software Architect
Microsoft Corporation
For me the personal computer revolution started in the mid-1970s, when my friend Paul Allen and I saw a magazine article about the MITS Altair 8800. The Altair was the first build-it-yourself computer kit for hobbyists. For a few hundred dollars, MITS would mail you a few bags of parts and some photocopied instructions. After some careful soldering, you had your own computer, roughly the size of a bread box, with rows of switches and blinking lights.
It wasn't much to look at and it wasn't terribly useful, but it felt like the start of a revolution. Until then computers were used mostly by technicians in air-conditioned rooms. Few people had the opportunity even to see a computer and even fewer got to use one. But the Altair was a computer that people could put on their desks, and what they could do with it was limited only by their imagination—and the modest capabilities of Intel's 8080 microprocessor.
We knew that microprocessors would become cheaper and more powerful, making personal computers increasingly capable. We also knew those computers would need software to make them do useful things. So Paul and I founded a company we called Microsoft that we hoped would meet this need.
Our first product was a version of the BASIC programming language that could run on the Altair. Unlike many other languages available at the time, BASIC was relatively simple to use. After a few minutes of instruction, even a nontechnical person could start writing simple programs. Actually developing this product, however, was not very simple. First, it was challenging to come up with a BASIC that could run in the Altair's limited memory and still leave room for people to write programs. Second, we didn't have an Altair to work with. Only a few prototypes were available at the time.
After writing software that would mimic the Altair's functions on another computer and spending nearly all our spare time writing code, some of it on paper notepads, we managed to create a BASIC that worked. For its time the Altair was a huge success, and thousands of programmers used our software to make it do interesting and useful things. Since then the PC has evolved from a hobbyist's toy into a powerful tool that has transformed how we work, learn, play, and keep in touch. And it has created an industry that employs millions of people and plays a leading role in our global economy.
Computing has made many evolutionary leaps over the decades, from the command line to the graphical user interface, from stand-alone PCs to a globally connected Internet. But we're now seeing an even more fundamental change. We're in what I call the "digital decade," a time when computers are moving beyond being merely useful to becoming an essential part of our everyday lives. Today we use computers for discrete tasks—like doing e-mail and paying bills—but in the years ahead they'll play a key role in almost everything we do. We'll rely on them to run our lives and businesses. We'll want them to keep us informed and entertained. We'll expect them to be wherever we need them. It will be an era of truly personal computing.
Many of our early dreams for the PC have already come true. Today's PCs can recognize speech and handwriting, create realistic animation, and enable people to collaborate, communicate, and find information around the world. But we've barely scratched the surface of the PC's potential, and I'm incredibly excited about the amazing innovations that are just over the horizon.