Computer History
Companies
1947
Computer pioneers Presper Eckert and
John Mauchly founded the Eckert-
Mauchly Computer Corp. to construct
machines based on their experience
with ENIAC and EDVAC. The only
machine the company built was
BINAC. Before completing the
UNIVAC, the company became a
division of Remington Rand.
Components
1947
The Williams tube won the race for a practical random-access memory.
Sir Frederick Williams of Manchester University modified a cathode-ray
tube to paint dots and dashes of phosphorescent electrical charge on the
screen, representing binary ones and zeros. Vacuum tube machines, such
as the IBM 701, used the Williams tube as primary memory.
Williams tube
On December 23, William Shockley, Walter Brattain, and John Bardeen
successfully tested the point-contact transistor, setting off the
semiconductor revolution. Improved models of the transistor, developed
at AT&T Bell Laboratories, supplanted vacuum tubes used on computers
at the time.
Point-contact transistor
1953
At MIT, Jay Forrester installed magnetic core memory on the Whirlwind
computer. Core memory made computers more reliable, faster, and
easier to make. Such a system of storage remained popular until the
development of semiconductors in the 1970s.
Core memory
1954
A silicon-based junction transistor, perfected by Gordon Teal of Texas
Instruments Inc., brought the price of this component down to $2.50. A
Texas Instruments news release from May 10, 1954, read, "Electronic
'brains' approaching the human brain in scope and reliability came much
closer to reality today with the announcement by Texas Instruments
Incorporated of the first commercial production of silicon transistors,
kernel-sized substitutes for vacuum tubes."
The company became a household name when the first transistor radio
incorporated Teal's invention. The radio, sold by Regency Electronics for
$50, launched the world into a global village of instant news and pop
music.
1955
AT&T Bell Laboratories announced TRADIC, the first fully transistorized
computer, programmed by Felker and Harris. It contained nearly 800
transistors instead of vacuum tubes. Transistors — completely cold,
highly efficient amplifying devices invented at Bell Labs — enabled the
machine to operate on fewer than 100 watts, or one-twentieth the power
required by comparable vacuum tube computers.
1958
Jack Kilby created the first integrated circuit at Texas Instruments to
prove that resistors and capacitors could exist on the same piece of
semiconductor material. His circuit consisted of a sliver of germanium
with five components linked by wires.
1959
Jean Hoerni's Planar process, invented at Fairchild Camera and
Instrument Corp., protected transistor junctions with a layer of oxide. This
improved reliability and, by allowing conducting channels to be printed
directly on the silicon surface, enabled Robert Noyce's invention of the
monolithic integrated circuit.
1961
Fairchild Camera and Instrument Corp. invented the resistor-transistor
logic (RTL) product, a set/reset flip-flop and the first integrated circuit
available as a monolithic chip.
1962
Fairchild Camera and Instrument Corp. produced the first widely accepted
epitaxial gold-doped NPN transistor. The NPN transistor served as the
industry workhorse for discrete logic.
1967
Fairchild Camera and Instrument Corp. built the first standard metal
oxide semiconductor product for data processing applications, an eight-
bit arithmetic unit and accumulator. In a MOS chip, engineers treat the
semiconductor material to produce either of two varieties of transistors,
called n-type and p-type.
MOS semiconductor
Intel 4004
1972
Intel's 8008 microprocessor made its debut. A vast improvement over its
predecessor, the 4004, its eight-bit word afforded 256 unique
arrangements of ones and zeros. For the first time, a microprocessor
could handle both uppercase and lowercase letters, all 10 numerals,
punctuation marks, and a host of other symbols.
Intel 8008
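The entry's claim that an eight-bit word affords 256 unique arrangements is simply two raised to the eighth power. The following quick arithmetic sketch in Python is illustrative only and is not part of the original timeline text.

```python
# Counting the bit patterns an 8-bit word can represent (illustrative sketch).
bits = 8
patterns = 2 ** bits          # 256 unique arrangements of ones and zeros
print(patterns)               # -> 256

# Plenty of room for a basic character set: uppercase and lowercase letters,
# the 10 numerals, and a generous allowance for punctuation and other symbols.
needed = 26 + 26 + 10 + 32    # rough count of printable symbols
print(needed <= patterns)     # -> True
```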
1976
Intel and Zilog introduced new microprocessors. Five times faster than its
predecessor, the 8008, the Intel 8080 could address four times as many
bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program
written for the 8080 and included twice as many built-in machine
instructions.
Zilog Z-80
1979
The Motorola 68000 microprocessor exhibited a processing speed far
greater than its contemporaries. This high-performance processor found
its place in powerful workstations intended for graphics-intensive
programs common in engineering.
Motorola 68000
1986
David Miller of AT&T Bell Labs patented the optical transistor, a
component central to digital optical computing. Called the Self-Electro-optic
Effect Device, or SEED, the transistor was a light-sensitive switch
built with layers of gallium arsenide and gallium aluminum arsenide.
Beams of light triggered electronic events that caused the light either to
be transmitted or absorbed, thus turning the switch on or off.
1987
Motorola unveiled the 68030 microprocessor. A step up from the 68020,
this enhanced 32-bit microprocessor combined a central processing
unit core, a data cache, an instruction cache, an enhanced bus controller,
and a memory management unit in a single VLSI device, all operating
at speeds of at least 20 MHz.
Motorola 68030
1988
Compaq and other PC-clone makers developed the Extended Industry
Standard Architecture (EISA), which outperformed IBM's Micro Channel
architecture while retaining compatibility with existing machines. EISA
used a 32-bit bus, the pathway over which devices communicate, making
it an improvement over the 16-bit bus of the original Industry Standard
Architecture. IBM's competitors developed EISA as a way to avoid paying
a fee to IBM for its MCA bus.
1989
Intel released the 80486 microprocessor and the i860 RISC/coprocessor
chip, each of which contained more than 1 million transistors. The RISC
microprocessor had a 32-bit integer arithmetic and logic unit (the part of
the CPU that performs operations such as addition and subtraction), a
64-bit floating-point unit, and a clock rate of 33 MHz.
Intel 80486
Motorola announced the 68040 microprocessor, with about 1.2 million
transistors. Although promised for January 1990, technical difficulties
delayed its shipment until 1991. A 32-bit, 25-MHz microprocessor,
the 68040 integrated a floating-point unit and included instruction and
data caches. Apple used the third generation of 68000 chips in Macintosh
Quadra computers.
Motorola 68040
1993
The Pentium microprocessor is released. The Pentium was the fifth
generation of the ‘x86’ line of microprocessors from Intel, the basis for
the IBM PC and its clones. The Pentium introduced several advances that
made programs run faster, such as the ability to execute several
instructions at the same time and support for graphics and music.
Computers
1939
Hewlett-Packard is founded. David Packard and Bill Hewlett founded
Hewlett-Packard in a Palo Alto, California, garage. Their first product was
the HP 200A Audio Oscillator, which rapidly became a popular piece of
test equipment for engineers. Walt Disney Pictures ordered eight of the
200B model to use as sound-effects generators for the 1940 movie
“Fantasia.”
1940
The Complex Number Calculator (CNC) is completed. In 1939, Bell
Telephone Laboratories completed this calculator, designed by researcher
George Stibitz. In 1940, Stibitz demonstrated the CNC at an American
Mathematical Society conference held at Dartmouth College. Stibitz
stunned the group by performing calculations remotely on the CNC
(located in New York City) using a Teletype connected via special
telephone lines. This is considered to be the first demonstration of remote
access computing.
Graphics & Games
1963
DAC-1 computer aided design program is released. In 1959, the General
Motors Research Laboratories appointed a special research team to
investigate the use of computers in designing automobiles. In 1960, IBM
joined the project, producing the first commercially-available Computer
Aided Design program, known as DAC-1. Out of that project came the
IBM 2250 display terminal as well as many advances in computer
timesharing and the use of a single processor by two or more terminals.
1972
Pong is released. In 1966, Ralph Baer designed a ping-pong game for his
Odyssey gaming console. Nolan Bushnell played this game at a Magnavox
product show in Burlingame, California. Bushnell hired young engineer Al
Alcorn to design a car driving game, but when it became apparent that
this was too ambitious for the time, he had Alcorn design a version of
ping-pong instead. The game was tested in bars in Grass Valley and
Sunnyvale, California where it proved very popular. Pong would
revolutionize the arcade industry and launch the modern video game era.
1977
Atari launches the Video Computer System game console. Atari released
the Atari Video Computer System (VCS), later renamed the Atari 2600.
The VCS was the first widely successful video game system, selling more
than twenty million units throughout the 1980s. The VCS used the 8-bit
MOS 6507 microprocessor and was designed to be connected to a home
television set. When the last of Atari’s 8-bit game consoles were made in
1990, more than 900 video game titles had been released.
1986
Pixar is founded. Pixar was originally called the Special Effects Computer
Group at Lucasfilm (launched in 1979). The group created the computer
animated segments of films such as “Star Trek II: The Wrath of Khan”
and “Young Sherlock Holmes.” In 1986, Apple Computer co-founder
Steve Jobs paid Lucasfilm $10 million to purchase the group and
renamed it Pixar. Over the next decade, Pixar made highly successful
(and Oscar-winning) animated films. It was bought by Disney in 2006.
Pixar Headquarters
1989
Virtual reality was the hot topic at SIGGRAPH's 1989 convention in
Boston. The Silicon Graphics booth
featured the new technology, designed by the computer-aided design
software company Autodesk and the computer company VPL. The term
describes a computer-generated 3-D environment that allows a user to
interact with the realities created there. The computer must calculate and
display sensory information quickly enough to fool the senses.
1990
Video Toaster is introduced by NewTek. The Video Toaster was a video
editing and production system for the Amiga line of computers and
included custom hardware and special software. Much more affordable
than any other computer-based video editing system, the Video Toaster
was not just a home product: it was popular with public-access stations
and was even good enough to be used for broadcast television shows like
Home Improvement.
1991
“Terminator 2: Judgment Day” opens. Director James Cameron’s sequel
to his 1984 hit “The Terminator” featured ground-breaking special
effects done by Industrial Light & Magic. Made for a record $100 million,
it was the most expensive movie ever made at the time. Most of this cost
was due to the expense of computer-generated special effects (such as
image morphing) throughout the film. Terminator 2 is one of many films
that critique civilization’s frequent blind trust in technology.
Original Movie Poster for
Terminator 2: Judgment Day
1993
“Doom” is released. id Software released Doom in late 1993. An
immersive first-person shooter-style game, Doom became popular on
many different platforms before losing popularity to games like Halo and
Counter-Strike. Doom players were also among the first to customize the
game’s levels and appearance. Doom would spawn several sequels and a
2005 film.
Networking
1960
AT&T designed its Dataphone, the first commercial modem, specifically
for converting digital computer data to analog signals for transmission
across its long distance network. Outside manufacturers incorporated Bell
Laboratories' digital data sets into commercial products. The
development of equalization techniques and bandwidth-conserving
modulation systems improved transmission efficiency in national and
global systems.
AT&T Dataphone
1964
Online transaction processing made its debut in IBM's SABRE reservation
system, set up for American Airlines. Using telephone lines, SABRE linked
2,000 terminals in 65 cities to a pair of IBM 7090 computers, delivering
data on any flight in less than three seconds.
JOSS (Johnniac Open Shop System) conversational time-sharing service
began on Rand's Johnniac. Time-sharing arose, in part, because the
length of batch turn-around times impeded the solution of problems.
Time sharing aimed to bring the user back into "contact" with the
machine for online debugging and program development.
JOSS configuration
1966
John van Geen of the Stanford Research Institute vastly improved the
acoustically coupled modem. His receiver reliably detected bits of data
despite background noise heard over long-distance phone lines. Inventors
developed the acoustically coupled modem to connect computers to the
telephone network by means of the standard telephone handset of the
day.
1970
Citizens and Southern National Bank in Valdosta, Ga., installed the
country's first automatic teller machine.
ARPANET topology
1971
The first e-mail is sent. Ray Tomlinson of the research firm Bolt, Beranek
and Newman sent the first e-mail when he was supposed to be working
on a different project. Tomlinson, who is credited with being the one to
decide on the "@" sign for use in e-mail, sent his message over a military
network called ARPANET. When asked to describe the contents of the first
e-mail, Tomlinson said it was “something like QWERTYUIOP.”
1972
Steve Wozniak built his "blue box," a tone generator used to make free
phone calls. Wozniak sold the boxes in dormitories at the University of
California, Berkeley, where he studied as
an undergraduate. "The early boxes had a safety feature — a reed switch
inside the housing operated by a magnet taped onto the outside of the
box," Wozniak remembered. "If apprehended, you removed the magnet,
whereupon it would generate off-frequency tones and be inoperable ...
and you tell the police: It's just a music box."
1973
Robert Metcalfe devised the Ethernet method of network connection at
the Xerox Palo Alto Research Center. He wrote: "On May 22, 1973, using
my Selectric typewriter ... I wrote ... "Ether Acquisition" ... heavy with
handwritten annotations — one of which was "ETHER!" — and with hand-
drawn diagrams — one of which showed 'boosters' interconnecting
branched cable, telephone, and radio ethers in what we now call an
internet.... If Ethernet was invented in any one memo, by any one
person, or on any one day, this was it."
Ethernet
1975
Telenet, the first commercial packet-switching network and civilian
equivalent of ARPANET, was born. The brainchild of Larry Roberts, Telenet
linked customers in seven cities. Telenet represented the first value-
added network, or VAN — so named because of the extras it offered
beyond the basic service of linking computers.
1976
The Queen of England sends her first e-mail. Elizabeth II, Queen of the
United Kingdom, sends out an e-mail on March 26 from the Royal Signals
and Radar Establishment (RSRE) in Malvern as a part of a demonstration
of networking technology.
1979
John Shoch and Jon Hupp at the Xerox Palo Alto Research Center
discovered the computer "worm," a short program that searches a network
for idle processors. Initially designed to provide more efficient use of
computers and for testing, the worm had the unintended effect of
invading networked computers, creating a security threat.
Shoch took the term "worm" from the book "The Shockwave Rider," by
John Brunner, in which an omnipotent "tapeworm" program runs loose
through a network of computers. Brunner wrote: "No, Mr. Sullivan, we
can't stop it! There's never been a worm with that tough a head or that
long a tail! It's building itself, don't you understand? Already it's
passed a billion bits and it's still growing. It's the exact inverse of a
phage — whatever it takes in, it adds to itself instead of wiping... Yes,
sir! I'm quite aware that a worm of that type is theoretically impossible!
But the fact stands, he's done it, and now it's so goddamn
comprehensive that it can't be killed. Not short of demolishing the net!"
(247, Ballantine Books, 1975).
The first Multi-User Domain (or Dungeon), MUD1, goes on-line. Richard
Bartle and Roy Trubshaw, two students at the University of Essex, write a
program that allows many people to play against each other on-line.
MUDs become popular with college students as a means of adventure
gaming and for socializing. By 1984, there are more than 100 active
MUDs and variants around the world.
1983
The ARPANET splits into the ARPANET and MILNET. Due to the success of
the ARPANET as a way for researchers in universities and the military to
collaborate, it was split into military (MILNET) and civilian (ARPANET)
segments. This was made possible by the adoption of TCP/IP, a
networking standard, three years earlier. The ARPANET was renamed the
“Internet” in 1995.
1985
The modern Internet gained support when the National Science
Foundation formed the NSFNET, linking five supercomputer centers at
Princeton University, Pittsburgh, University of California at San Diego,
University of Illinois at Urbana-Champaign, and Cornell University. Soon,
several regional networks developed; eventually, the government
reassigned pieces of the ARPANET to the NSFNET. The NSF allowed
commercial use of the Internet for the first time in 1991, and in 1995, it
decommissioned the backbone, leaving the Internet a self-supporting
industry.
The Whole Earth 'Lectronic Link (WELL) is founded. Stewart Brand and
Larry Brilliant started an on-line Bulletin Board System (BBS) to build a
“virtual community” of computer users at low cost. Journalists were given
free memberships in the early days, leading to many articles about it and
helping it grow to thousands of members around the world.
1988
Robert Morris' worm flooded the ARPANET. Then-23-year-old Morris, the
son of a computer security expert for the National Security Agency, sent
a nondestructive worm through the Internet, causing problems for about
6,000 of the 60,000 hosts linked to the network. A researcher at
Lawrence Livermore National Laboratory in California discovered the
worm. "It was like the Sorcerer´s Apprentice," Dennis Maxwell, then a
vice president of SRI, told the Sydney (Australia) Sunday Telegraph at
the time. Morris was sentenced to three years of probation, 400 hours of
community service, and a fine of $10,050.
Morris, who said he was motivated by boredom, programmed the worm
to reproduce itself and computer files and to filter through all the
networked computers. The size of the reproduced files eventually became
large enough to fill the computers' memories, disabling them.
ARPANET worm
1990
The World Wide Web was born when Tim Berners-Lee, a researcher at
CERN, the high-energy physics laboratory in Geneva, developed
HyperText Markup Language. HTML, as it is commonly known, allowed
the Internet to expand into the World Wide Web, using specifications he
developed such as URL (Uniform Resource Locator) and HTTP
(HyperText Transfer Protocol). A browser, such as Netscape or Microsoft
Internet Explorer, follows links and sends a query to a server, allowing a
user to view a site.
Berners-Lee proposal
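The entry says a browser follows links and sends a query to a server. The sketch below shows roughly what such a query looks like as a hand-written HTTP GET request over a raw socket; the host example.com and the HTTP/1.0-style request line are assumptions for illustration, not details from the original timeline.

```python
# Minimal sketch of the query a browser sends when it follows a link
# (illustrative only; example.com and the HTTP/1.0 form are assumptions).
import socket

host = "example.com"
request = (
    "GET /index.html HTTP/1.0\r\n"   # method, resource path, protocol version
    f"Host: {host}\r\n"
    "\r\n"                            # blank line ends the request headers
)

with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = sock.recv(4096)        # first chunk of the server's reply

# The first line of the reply is the status line, e.g. "HTTP/1.0 200 OK".
print(response.decode("ascii", errors="replace").splitlines()[0])
```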
1993
The Mosaic web browser is released. Mosaic was the first commercial
software that allowed graphical access to content on the internet.
Designed by Eric Bina and Marc Andreessen at the University of Illinois’s
National Center for Supercomputing Applications, Mosaic was originally
designed for Unix systems running the X Window System. By 1994, Mosaic was
available for several other operating systems such as the Mac OS,
Windows and AmigaOS.
People & Pop Culture
1945
On September 9th, Grace Hopper recorded the first actual computer
"bug" — a moth stuck between the relays and logged at 15:45 hours on
the Harvard Mark II. Hopper, a rear admiral in the U.S. Navy, enjoyed
successful careers in academia, business, and the military while making
history in the computer field. She helped program the Harvard Mark I and
II and developed the first compiler, A-0. Her subsequent work on
programming languages led to COBOL, a language specified to operate
on machines of different manufacturers.
Grace Hopper
1949
Thomas Watson Jr., speaking to an IBM sales meeting, predicted all
moving parts in machines would be replaced by electronics within a
decade.
1952
On election night, November 4, CBS News borrowed a UNIVAC to make a
scientific prediction of the outcome of the race for the presidency
between Dwight D. Eisenhower and Adlai Stevenson. The opinion polls
predicted a landslide in favor of Stevenson, but the UNIVAC's analysis of
early returns showed a clear victory for Eisenhower. Its sharp divergence
from public opinion made newscasters Walter Cronkite and Charles
Collingwood question the validity of the computer's forecast, so they
postponed announcing UNIVAC's prediction until very late.
1954
Alan Turing was found dead at age 42. He had published his seminal
paper, "On Computable Numbers," in 1936, as well as posing significant
questions about judging "human intelligence" and programming and
working on the design of several computers during the course of his
career.
Alan Turing
1955
First meeting of SHARE, the IBM users group, convened. User groups
became a significant educational force allowing companies to
communicate innovations and users to trade information.
1970
Vietnam War protesters attacked university computer centers. At the
University of Wisconsin, the toll was one human and four machines.
1982
Time magazine altered its annual tradition of naming a "Man of the Year,"
choosing instead to name the computer its "Machine of the Year." In
introducing the theme, Time publisher John A. Meyers wrote, "Several
human candidates might have represented 1982, but none symbolized
the past year more richly, or will be viewed by history as more significant,
than a machine: the computer."
1984
In his novel "Neuromancer," William Gibson coined the term
"cyberspace." He also spawned a genre of fiction known as "cyberpunk"
in his book, which described a dark, complex future filled with intelligent
machines, computer viruses, and paranoia.
Gibson's Neuromancer
1988
Pixar's "Tin Toy" became the first computer-animated film to win an
Academy Award, taking the Oscar for best animated short film. "Tin Toy"
told the story of a wind-up toy's first encounter with a boisterous baby.
To illustrate the baby's facial expressions, programmers defined more
than 40 computer-modeled facial muscles controlled by the animator.
Robots &
Artificial
Intelligence
1948
Norbert Wiener published "Cybernetics," a major influence on later
research into artificial intelligence. He drew on his World War II
experiments with anti-aircraft systems that anticipated the course of
enemy planes by interpreting radar images. Wiener coined the term
"cybernetics" from the Greek word for "steersman."
Norbert Wiener
1959
MIT's Servomechanisms Laboratory demonstrated computer-assisted
manufacturing. The school's Automatically Programmed Tools project
created a language, APT, used to instruct milling machine operations. At
the demonstration, the machine produced an ashtray for each attendee.
APT ashtray
1961
UNIMATE, the first industrial robot, began work at General Motors.
Obeying step-by-step commands stored on a magnetic drum, the 4,000-
pound arm sequenced and stacked hot pieces of die-cast metal.
UNIMATE
1963
Researchers designed the Rancho Arm at Rancho Los Amigos Hospital in
Downey, California as a tool for the handicapped. The Rancho Arm's six
joints gave it the flexibility of a human arm. Acquired by Stanford
University in 1963, it holds a place among the first artificial robotic arms
to be controlled by a computer.
Rancho Arm
1965
A Stanford team led by Ed Feigenbaum created DENDRAL, the first expert
system, or program designed to execute the accumulated expertise of
specialists. DENDRAL applied a battery of "if-then" rules in chemistry and
physics to identify the molecular structure of organic compounds.
1968
Marvin Minsky developed the Tentacle Arm, which moved like an octopus.
It had twelve joints designed to reach around obstacles. The arm, powered
by hydraulic fluid, was controlled by a PDP-6 computer. Mounted on a
wall, it could lift the weight of a person.
Tentacle Arm
1969
Victor Scheinman's Stanford Arm made a breakthrough as the first
successful electrically powered, computer-controlled robot arm. By 1974,
the Stanford Arm could assemble a Ford Model T water pump, guiding
itself with optical and contact sensors. The Stanford Arm led directly to
commercial production. Scheinman went on to design the PUMA series of
industrial robots for Unimation, robots used for automobile assembly and
other industrial tasks.
Stanford Arm
1970
SRI International's Shakey became the first mobile robot controlled by
artificial intelligence. Equipped with sensing devices and driven by a
problem-solving program called STRIPS, the robot found its way around
the halls of SRI by applying information about its environment to a route.
Shakey used a TV camera, laser range finder, and bump sensors to
collect data, which it then transmitted to a DEC PDP-10 and PDP-15. The
computer radioed back commands to Shakey — who then moved at a
speed of 2 meters per hour.
SRI´s Shakey
1974
David Silver at MIT designed the Silver Arm, a robotic arm to do small-
parts assembly using feedback from delicate touch and pressure sensors.
The arm's fine movements corresponded to those of human fingers.
Silver Arm
1976
Shigeo Hirose's Soft Gripper could conform to the shape of a grasped
object, such as a wine glass filled with flowers. The design Hirose
created at the Tokyo Institute of Technology grew from his studies of
flexible structures in nature, such as elephant trunks and snake spinal
cords.
1978
Texas Instruments Inc. introduced Speak & Spell, a talking learning aid
for ages 7 and up. Its debut marked the first electronic duplication of the
human vocal tract on a single chip of silicon. Speak & Spell utilized linear
predictive coding to formulate a mathematical model of the human vocal
tract and predict a speech sample based on previous input. It
transformed digital information processed through a filter into synthetic
speech and could store more than 100 seconds of linguistic sounds.
Shown here are the four individuals who began the Speak & Spell
program: From left to right, Gene Frantz, Richard Wiggins, Paul
Breedlove, and George Brantingham.
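The entry describes linear predictive coding as modeling the vocal tract and predicting a speech sample from previous input. The toy sketch below shows only that core prediction step; the coefficients and sample values are made up for illustration and are not TI's actual filter data.

```python
# Toy linear-predictive-coding predictor: estimate the next speech sample as a
# weighted sum of the previous samples. The coefficients below are invented for
# illustration; the Speak & Spell chip derived its own per-frame coefficients.
def lpc_predict(samples, coeffs):
    """Predict the next sample from the most recent len(coeffs) samples."""
    recent = samples[-len(coeffs):]
    # the most recent sample pairs with the first coefficient
    return sum(a * s for a, s in zip(coeffs, reversed(recent)))

history = [0.0, 0.2, 0.5, 0.7, 0.6]      # previous samples (illustrative)
coeffs = [1.2, -0.5, 0.1]                # hypothetical predictor coefficients
print(lpc_predict(history, coeffs))      # predicted value of the next sample
```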
1983
The Musical Instrument Digital Interface was introduced at the first North
American Music Manufacturers show in Los Angeles. MIDI is an industry-
standard electronic interface that links electronic music synthesizers. The
MIDI information tells a synthesizer when to start and stop playing a
specific note, what sound that note should have, how loud it should be,
and other information.
MIDI
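The entry notes that MIDI messages tell a synthesizer when to start and stop a note, which note to play, and how loud it should be. The byte layout below follows the published MIDI note-on/note-off convention; the helper functions themselves are only an illustrative sketch.

```python
# Illustrative encoding of MIDI note-on / note-off messages: a status byte
# (message type plus channel), then the note number, then the velocity (loudness).
def note_on(channel, note, velocity):
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

msg = note_on(channel=0, note=60, velocity=100)   # middle C, fairly loud
print(msg.hex())                                  # -> "903c64"
print(note_off(0, 60).hex())                      # -> "803c00"
```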
Software & Languages
1945
Konrad Zuse began work on Plankalkül (Plan Calculus), the first
algorithmic programming language, with an aim of creating the
theoretical preconditions for the formulation of problems of a general
nature. Seven years earlier, Zuse had developed and built the world's
first binary digital computer, the Z1. He completed the first fully
functional program-controlled electromechanical digital computer, the Z3,
in 1941. Only the Z4 — the most sophisticated of his creations —
survived World War II.
Konrad Zuse
1948
Claude Shannon's "The Mathematical Theory of Communication" showed
engineers how to code data so they could check for accuracy after
transmission between computers. Shannon identified the bit as the
fundamental unit of data and, coincidentally, the basic unit of
computation.
Claude Shannon
1952
Grace Hopper completes the A-0 Compiler. In 1952, mathematician
Grace Hopper completed what is considered to be the first compiler, a
program that allows a computer user to use English-like words instead of
numbers. Other compilers based on A-0 followed: ARITH-MATIC,
MATH-MATIC, and FLOW-MATIC.
1953
John Backus completed Speedcoding for IBM's 701 computer. Although
Speedcoding demanded more memory and compute time, it trimmed
weeks off the programming schedule.
IBM 701
1955
Herbert Simon and Allen Newell unveiled Logic Theorist software that
supplied rules of reasoning and proved symbolic logic theorems. The
release of Logic Theorist marked a milestone in establishing the field of
artificial intelligence.
1956
In the mid-fifties resources for scientific and engineering computing were
in short supply and were very precious. The first operating system for the
IBM 704 reflected the cooperation of Bob Patrick of General Motors
Research and Owen Mock of North American Aviation. Called the GM-NAA
I/O System, it provided batch processing and increased the number of
completed jobs per shift with no increase in cost. Some version of the
system was used in about forty 704 installations.
1957
Sperry Rand released a commercial compiler for its UNIVAC. Developed
by Grace Hopper as a refinement of her earlier innovation, the A-0
compiler, the new version was called MATH-MATIC. Earlier work on the A-
0 and A-2 compilers led to the development of the first English-language
business data processing compiler, B-0 (FLOW-MATIC), also completed in
1957. FLOW-MATIC served as a model on which to build with input from
other sources.
UNIVAC MATH-MATIC
1959
ERMA, the Electronic Recording Method of Accounting, digitized checking
for the Bank of America by creating a computer-readable font. A special
scanner read account numbers preprinted on checks in magnetic ink.
ERMA characters
1960
A team drawn from several computer manufacturers and the Pentagon
developed COBOL, Common Business Oriented Language. Designed for
business use, early COBOL efforts aimed for easy readability of computer
programs and as much machine independence as possible. Designers
hoped a COBOL program would run on any computer for which a compiler
existed with only minimal modifications.
LISP made its debut as the first computer language designed for writing
artificial intelligence programs. Created by John McCarthy, LISP offered
programmers flexibility in organization.
1962
MIT students Slug Russell, Shag Graetz, and Alan Kotok wrote
SpaceWar!, considered the first interactive computer game. First played
at MIT on DEC's PDP-1, the large-scope display featured interactive,
shoot-'em-up graphics that inspired future video games. Dueling players
fired at each other's spaceships and used early versions of joysticks to
maneuver away from the central gravitational force of a sun as well as
from the enemy ship.
Spacewar! on PDP-1
1963
Ivan Sutherland published Sketchpad, an interactive, real time computer
drawing system, as his MIT doctoral thesis. Using a light pen and
Sketchpad, a designer could draw and manipulate geometric figures on
the screen.
Sketchpad document
1964
Thomas Kurtz and John Kemeny created BASIC, an easy-to-learn
programming language, for their students at Dartmouth College.
BASIC manual
1965
Object-oriented languages got an early boost with Simula, written by
Kristen Nygaard and Ole-Johan Dahl. Simula grouped data and instructions
into blocks called objects, each representing one facet of a system
intended for simulation.
1967
Seymour Papert designed LOGO as a computer language for children.
Initially a drawing program, LOGO controlled the actions of a mechanical
"turtle," which traced its path with pen on paper. Electronic turtles made
their designs on a video display monitor.
Seymour Papert
1968
Edsger Dijkstra's "GO TO considered harmful" letter, published in
Communications of the ACM, fired the first salvo in the structured
programming wars. The ACM considered the resulting acrimony
sufficiently harmful that it established a policy of no longer printing
articles taking such an assertive position against a coding practice.
1969
The RS-232-C standard for communication permitted computers and
peripheral devices to transmit information serially — that is, one bit at a
time. The RS-232-C protocol spelled out a purpose for a serial plug's 25
connector pins.
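The entry describes serial transmission as sending information one bit at a time. The small sketch below breaks a byte into individual bits, least-significant bit first as asynchronous serial lines typically do; framing details such as start and stop bits are omitted, and the function itself is purely illustrative.

```python
# Illustrative sketch of serial transmission: break each byte into single bits,
# least-significant bit first (the order typically used on asynchronous serial
# lines). Start/stop framing bits are omitted here.
def to_serial_bits(data: bytes):
    for byte in data:
        for position in range(8):
            yield (byte >> position) & 1   # one bit at a time

print(list(to_serial_bits(b"A")))          # 0x41 -> [1, 0, 0, 0, 0, 0, 1, 0]
```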
AT&T Bell Laboratories programmers Kenneth Thompson and Dennis
Ritchie developed the UNIX operating system on a spare DEC
minicomputer. UNIX combined many of the timesharing and file
management features offered by Multics, from which it took its name.
(Multics, a project of the mid-1960s, represented the first effort at
creating a multi-user, multi-tasking operating system.) The UNIX
operating system quickly secured a wide following, particularly among
engineers and scientists.
1972
Nolan Bushnell introduced Pong and his new video game company, Atari.
1976
Gary Kildall developed CP/M, an operating system for personal
computers. Widely adopted, CP/M made it possible for one version of a
program to run on a variety of computers built around eight-bit
microprocessors.
CP/M
1977
The U.S. government adopted IBM's data encryption standard, the key to
unlocking coded messages, to protect confidentiality within its agencies.
Available to the general public as well, the standard required an eight-
number key for scrambling and unscrambling data. The 70 quadrillion
possible combinations made breaking the code by trial and error unlikely.
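The "70 quadrillion possible combinations" figure follows from DES's 56 effective key bits; the detail that the 64-bit key reserves 8 bits for parity is general background on DES, not something stated in the timeline. A quick arithmetic check:

```python
# Checking the keyspace figure: DES uses a 64-bit key, of which 56 bits are
# effective (the remaining 8 are parity bits), so the keyspace is 2**56.
effective_bits = 56
keyspace = 2 ** effective_bits
print(keyspace)            # 72,057,594,037,927,936
print(keyspace / 1e15)     # ~72 quadrillion, matching the rough figure quoted above
```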
1979
Harvard MBA candidate Daniel Bricklin and programmer Robert Frankston
developed VisiCalc, the program that made a business machine of the
personal computer, for the Apple II. VisiCalc (for Visible Calculator)
automated the recalculation of spreadsheets. A huge success, VisiCalc
sold more than 100,000 copies in one year.
1982
Mitch Kapor developed Lotus 1-2-3, writing the software directly into the
video system of the IBM PC. By bypassing DOS, it ran much faster than
its competitors. Along with the immense popularity of the IBM PC,
Lotus owed much of its success to its working combination of
spreadsheet capabilities with graphics and data retrieval capabilities.
Lotus 1-2-3
1983
Microsoft announced Word, originally called Multi-Tool Word, and
Windows. The latter didn't ship until 1985, although the company had said
it would be on track for an April 1984 release. In a marketing blitz,
Microsoft distributed 450,000 disks demonstrating its Word program in
the November issue of PC World magazine.
1985
Aldus announced its PageMaker program for use on Macintosh
computers, launching an interest in desktop publishing. Two years later,
Aldus released a version for IBMs and IBM-compatible computers.
Developed by Paul Brainerd, who founded Aldus Corp., PageMaker
allowed users to combine graphics and text easily enough to make
desktop publishing practical.
Aldus PageMaker
1987
Apple engineer William Atkinson designed HyperCard, a software tool that
simplified the development of in-house applications. HyperCard differed from
previous programs of its sort because Atkinson made it interactive rather
than language-based and geared it toward the construction of user
interfaces rather than the processing of data. In HyperCard, programmers
built stacks with the concept of hypertext links between stacks of pages.
Apple distributed the program free with Macintosh computers until 1992.
1989
Maxis released SimCity, a video game that helped launch a series of
simulators. Maxis cofounder Will Wright built on his childhood interest in
plastic models of ships and airplanes, eventually starting up a company
with Jeff Braun and designing a computer program that allowed the user
to create his own city. A number of other Sims followed in the series,
including SimEarth, SimAnt, and SimLife.
1990
Microsoft shipped Windows 3.0 on May 22. Compatible with DOS
programs, the first successful version of Windows finally offered good
enough performance to satisfy PC users. For the new version, Microsoft
revamped the interface and created a design that allowed PCs to support
large graphical applications for the first time. It also allowed multiple
programs to run simultaneously on the Intel 80386 microprocessor.
1991
The Linux operating system is introduced. Designed by Finnish university
student Linus Torvalds, Linux was released to several Usenet newsgroups
on September 17th, 1991. Almost immediately, enthusiasts began
developing and improving Linux, such as adding support for peripherals
and improving its stability. Linux is now one of several open source Unix-
like operating systems.
Storage
1956
The era of magnetic disk storage dawned with IBM's shipment of a 305
RAMAC to Zellerbach Paper in San Francisco. The IBM 350 disk file served
as the storage component for the Random Access Method of Accounting
and Control. It consisted of 50 magnetically coated metal platters with 5
million bytes of data. The platters, stacked one on top of the other,
rotated with a common drive shaft.
1961
IBM 1301 Disk Storage Unit is released. The IBM 1301 Disk Drive was
announced on June 2nd, 1961 for use with IBM’s 7000-series of
mainframe computers. Maximum capacity was 28 million characters and
the disks rotated at 1,800 R.P.M. The 1301 leased for $2,100 per month
or could be purchased for $115,500. The drive had one read/write arm
for each disk as well as flying heads, both of which are still used in
today’s disk drives.
1967
IBM 1360 Photo-Digital Storage System is delivered. In 1967, IBM
delivered the first of its photo-digital storage systems to Lawrence
Livermore National Laboratory. The system could read and write up to a
trillion bits of information—the first such system in the world. The 1360
used thin strips of film which were developed with bit patterns via a
photographic developing system housed in the machine. The system used
sophisticated error correction and a pneumatic robot to move the film
strips to and from a storage unit. Only five were built.
1971
An IBM team, originally led by David Noble, invented the 8-inch floppy
diskette. It was initially designed for use in loading microcode into the
controller for the "Merlin" (IBM 3330) disk pack file. It quickly won
widespread acceptance as a program and data-storage medium. Unlike
hard drives, a user could easily transfer a floppy in its protective jacket
from one drive to another.
IBM 23FD 8-inch floppy disk drive
1978
The 5 1/4" flexible disk drive and diskette were introduced by Shugart
Associates in 1976. This was the result of a request by Wang Laboratories
to produce a disk drive small enough to use with a desktop computer,
since 8" floppy drives were considered too large for that purpose. By
1978, more than 10 manufacturers were producing 5 1/4" floppy drives.
1980
Seagate Technology created the first hard disk drive for microcomputers,
the ST506. The disk held 5 megabytes of data, five times as much as a
standard floppy disk, and fit in the space of a floppy disk drive. The hard
disk drive itself is a rigid metallic platter coated on both sides with a thin
layer of magnetic material. Seagate was founded by Alan Shugart and Finis
Conner, who had worked together at IBM. The two men decided to found
the company after developing the idea of scaling down a hard disk drive to
the same size as the then-standard 5 1/4-inch floppies. Upon releasing its
first product, Seagate quickly drew such big-name customers as Apple
Computer and IBM. Within a few years, it had sold 4 million units.
Shugart ST506 5MB Hard Disk Drive
1981
The first optical data storage disk had 60 times the capacity of a 5 1/4-
inch floppy disk. Developed by Philips, the disk stored data as marks
burned by a laser that could not be overwritten — making it useful for
archival storage. Several years later, Philips created an erasable optical
disk using a special material.
Sony introduced and shipped the first 3 1/2" floppy drives and diskettes
in 1981. The first significant company to adopt the 3 1/2" floppy for
general use was Hewlett-Packard in 1982, an event which was critical in
establishing momentum for the format and which helped it prevail over
the other contenders for the microfloppy standard, including 3", 3 1/4",
and 3.9" formats.
Sony 3 1/2
1983
The Bernoulli Box is released. Using disks that included the read/write
head inside them, the Bernoulli Box was a special type of disk drive that
allowed people to move large files between computers when few
alternatives (such as a network) existed. Allowing for almost twenty
times the amount of storage afforded by a regular floppy disk, the
cartridges came in capacities ranging from 35MB to 230MB.
Original Bernoulli Box
1984
Able to hold 550 megabytes of prerecorded data, CD-ROMs grew out of
regular CDs on which music is recorded. The first general-interest CD-
ROM product released after Philips and Sony announced the CD-ROM in
1984 was "Grolier´s Electronic Encyclopedia," which came out in 1985.
The 9 million words in the encyclopedia only took up 12 percent of the
available space. The same year, computer and electronics companies
worked together to set a standard for the disks so any computer would be
able to access the information.
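The 12 percent figure checks out with back-of-the-envelope arithmetic, assuming an average English word plus a space takes roughly seven bytes; that average is an assumption for illustration, not a number from the timeline.

```python
# Back-of-the-envelope check of the encyclopedia figures (the average word
# length is an assumed value, not a figure from the timeline).
capacity_mb = 550
words = 9_000_000
bytes_per_word = 7                         # assumed: ~6 letters plus a space
text_mb = words * bytes_per_word / 1_000_000
print(text_mb)                             # ~63 MB of raw text
print(round(100 * text_mb / capacity_mb))  # roughly 11-12 percent of the disc
```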
1994
The Iomega Zip Disk is released. The Zip Disk was based on Iomega’s
Bernoulli Box system of file storage that used special removable disk
packs. Using the Bernoulli system as a model, the initial Zip system
allowed 100MB to be stored on a cartridge roughly the size of a 3 ½ inch
floppy disk. Later versions increased the capacity of a single disk from
100MB to 2GB.
The history of MS-DOS is surprisingly long. It started off as QDOS (Quick and Dirty
Operating System), which was developed by Seattle Computer Products for its 8086-based
machines; Microsoft bought it and adapted it to run on IBM's new PC. This list is fairly
comprehensive, although a number of the more obscure versions of DOS have been omitted.
Microsoft
first began development of the Interface Manager (subsequently renamed Microsoft
Windows) in September 1981.
Although the first prototypes used Multiplan and Word-like menus at the bottom of the
screen, the interface was changed in 1982 to use pull-down menus and dialogs, as used
on the Xerox Star.
Microsoft finally announced Windows in November 1983, under pressure from the just-
released VisiOn and the impending TopView.
This was after the release of the Apple Lisa, and before Digital Research announced GEM,
and before DESQ from Quarterdeck, the Amiga Workbench, GEOS/GeoWorks Ensemble,
IBM OS/2, NeXTstep, or even DeskMate from Tandy.
Windows promised an easy-to-use graphical interface, device-independent graphics and
multitasking support.
Development was delayed several times, however, and Windows 1.0 did not hit store
shelves until November 1985. The selection of applications was sparse, and Windows
sales were modest.
Windows/286 >>
Windows 3.0, released in May 1990, was a complete overhaul of the Windows
environment. With the capability to address memory beyond 640K and a much more
powerful user interface, independent software vendors started developing Windows
applications with vigor. The powerful new applications helped Microsoft sell more than
10 million copies of Windows, making it the best-selling graphical user interface in the
history of computing.
Windows CE has the look and feel of Windows 95 and NT. Users familiar with either of
these operating systems are able to instantly use Handheld PCs and Palm-size PCs.
Windows CE 1.0 devices appeared in November 1996. Over the next year,
approximately 500,000 Handheld PC units were sold worldwide.
Windows CE 2.0>>
Click here for more on the Windows CE/OS Windows CE 2.0 became
available in early 1998 addresses most of the problems experienced by Windows CE 1.0
users and also added features to the operating system that make it more viable for use by
corporate rather than home users.
Windows CE 3.0 became available on June 15, 2000. This embedded operating system and
its comprehensive development tools — Platform Builder 3.0 and eMbedded Visual Tools
3.0 — enable developers to build rich embedded devices that demand dynamic
applications and Internet services. Windows CE 3.0 combines the flexibility and the
reliability of an embedded platform with the power of Windows and the Internet.
Windows NT 5.0 will include a host of new features. Like Windows 98, it will integrate
Internet Explorer 4.0 into the operating system. This new interface will be matched up
with the Distributed File System, which Microsoft says will provide "a logical way to
organize and navigate the huge volume of information an enterprise assembles on
servers, independent of where the servers are physically located."
As of November 1998, NT 5.0 will be known as Windows 2000, making NT a
"mainstream" operating system.
Released on February 17, 2000, Windows 2000 provides an impressive platform of
Internet, intranet, extranet, and management applications that integrate tightly with
Active Directory. You can set up virtual private networks - secure, encrypted connections
across the Internet - with your choice of protocol. You can encrypt data on the network or
on disk. You can give users consistent access to the same files and objects from any
network-connected PC. You can use the Windows Installer to distribute software to users
over the LAN.
On Thursday, September 14, 2000, Microsoft released Windows Me, short for Millennium
Edition, which is aimed at the home user. The Me operating system boasts some enhanced
multimedia features, such as an automated video editor and improved Internet plumbing.
But unlike Microsoft's Windows 2000 OS, which offers advanced security, reliability, and
networking features, Windows Me is basically just an upgrade to the DOS-based code on
which previous Windows versions have been built.
WINDOWS XP
And in 2002 comes:
LindowsOS SPX - the first "Broadband OS"
LindowsOS SPX is designed to fully utilize the world of tomorrow, where Internet connectivity is
bountiful and cheap, and computers are ubiquitous. For tomorrow's computing needs, computer
users need a computing solution that's affordable and beneficial, a system where software is
digitally transmitted, easy to deploy and highly customizable. Computing needs to be effortless, so
people spend less time working on computers and more time having computers work for them;
LindowsOS SPX, the broadband operating system, does all of this.
LindowsOS SPX provides an advanced digital experience at an affordable price. Applications can
all be digitally downloaded and installed at the click of a mouse. Individual machines can be
customized quickly and easily.
LINDOWSOS BRIDGES THE GAP TO COMPUTER OWNERSHIP WITH MICROTEL PCs
AND WALMART.COM
Microtel Computer Systems, Pre-Installed with LindowsOS, to Cost Less Than $300 at
Walmart.com
SAN DIEGO –June 17, 2002 — Lindows.com, Inc., whose mantra has been “Bringing Choice to
Your Computer,” is now delivering on its promises of choice by partnering with Microtel
Computer Systems to ship Lindows.com’s Operating System, LindowsOS, pre-installed on their
personal computers. For less than $300, computer-buyers can take advantage of LindowsOS and
Microtel’s offering at Walmart.com (NYSE and PCX: WMT), thereby bringing computer
ownership closer to those with limited resources.
The LGP30 was built by the Librascope division of General Precision in the mid-1950s. It was
implemented with vacuum tubes and drum memory. It used a Flexowriter for I/O. The
instructions had three addresses, two for the operands and one for the next instruction.
Digital Equipment Corporation's first computer
was the PDP-1.
Spacewar! is considered the first video game; it was written by Steve "Slug" Russell at
MIT in 1960-61.
The PDP-8 was the world's first minicomputer. It sold for the amazingly low
price of $20,000.
2nd century, BC. The Hydraulis was invented by Ktesibios sometime in the
second century B.C. Ktesibios, the son of a Greek barber, was fascinated by
pneumatics and wrote an early treatise on the use of hydraulic systems for
powering mechanical devices. His most famous invention, the Hydraulis, used
water to regulate the air pressure inside an organ. A small cistern called the
pnigeus was turned upside down and placed inside a barrel of water. A set of
pumps forced air into the pnigeus, forming an air reservoir, and that air was
channeled up into the organ's action.
Greek Aeolian harp. This may be considered the first automatic instrument. It
was named for Aeolus, the Greek god of the wind. The instrument had two
bridges over which the strings passed. The instrument was placed in a window
where air current would pass, and the strings were activated by the wind current.
Rather than being of different lengths, the strings were all the same length and
tuned to the same pitch, but because of different string thicknesses, varying
pitches could be produced.
890 AD The Banu Musa brothers wrote an organ-building treatise; this was the first
written documentation of an automatic instrument.
Soggetto cavato, a technique of mapping letters of the alphabet into pitches, was
developed. This technique was used in Josquin's Mass based on the name of
Hercules, the Duke of Ferrara. One application of soggetto cavato would be
to take the vowels in Hercules as follows: e=re=D; u=ut=C (in the solfege
system of do, re, mi, fa, etc., ut was the original do syllable); e=re=D. This pattern
of vowel-mapping could continue for first and last names, as well as towns and
cities.
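The vowel-mapping rule described above (each vowel takes the pitch of the solfège syllable containing it) can be written as a small lookup table. The sketch below is hypothetical: it uses the equivalences given in the passage (u=ut=C, e=re=D) plus the remaining standard ones, and it follows the traditional Hercules example in mapping the vowel "a" to fa rather than la.

```python
# Hypothetical sketch of the soggetto cavato vowel-mapping described above:
# each vowel is assigned the pitch of a solfege syllable containing it
# (u=ut=C, e=re=D, i=mi=E, a=fa=F, o=sol=G). Choosing fa over la for the
# vowel "a" is itself ambiguous; the traditional Hercules theme uses fa.
VOWEL_TO_PITCH = {"u": "C", "e": "D", "i": "E", "a": "F", "o": "G"}

def soggetto_cavato(text):
    """Map the vowels of a name to pitches, ignoring everything else."""
    return [VOWEL_TO_PITCH[ch] for ch in text.lower() if ch in VOWEL_TO_PITCH]

print(soggetto_cavato("Hercules"))              # ['D', 'C', 'D']
print(soggetto_cavato("Hercules Dux Ferrarie")) # ['D', 'C', 'D', 'C', 'D', 'F', 'E', 'D']
```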
1500s The first mechanically driven organs were built; water organs called
hydraulis were in existence.
1624 English philosopher and essayist Francis Bacon wrote about a scientific
utopia in the New Atlantis. He stated "we have sound-houses, where we practice
and demonstrate all sounds, and their generation. We have harmonies which you
have not, of quarter-sounds, and less slides of sounds."
1787 Mozart composed the Musikalisches Würfelspiel (Musical Dice Game). This
composition was a series of precomposed measures arranged in random eight-
bar phrases to build the composition. Each throw of a pair of dice represented an
individual measure, so after eight throws the first phrase was determined.
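The dice-game procedure just described (each throw of two dice selects one precomposed measure, and eight throws assemble the first phrase) is easy to simulate. The sketch below uses a placeholder measure table, since the actual tables mapping throws to Mozart's measures are not reproduced here.

```python
# Sketch of the musical dice game procedure described above: each throw of two
# dice (a total from 2 to 12) picks one precomposed measure, and eight throws
# assemble the first phrase. The measure table is a placeholder; the real tables
# map every throw in every position to a specific numbered measure.
import random

measure_table = {total: f"measure-{total}" for total in range(2, 13)}  # placeholder

def throw_phrase(bars=8):
    phrase = []
    for _ in range(bars):
        total = random.randint(1, 6) + random.randint(1, 6)  # two dice
        phrase.append(measure_table[total])
    return phrase

print(throw_phrase())   # e.g. ['measure-7', 'measure-4', ...]
```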
1796 Carillons, "a sliver of steel, shaped, polished, tempered and then screwed
into position so that the projections on a rotating cylinder could pluck at its free
extremity," were invented.
1830 Robert Schumann composed the Abegg Variations, op. 1. This composition
was named for one of his girlfriends. The principal theme is based on the letters
of her name: A-B-E-G-G--this was a later application of a soggetto cavato
technique.
1835 Schumann composed the Carnaval pieces, op. 9, twenty-one short pieces
for piano. Each piece is based on a different character.
1876 Elisha Gray (an inventor of a telephone, along with Bell) invented the
Electroharmonic or Electromusical Piano; this instrument transmitted musical
tones over wires.
Koenig's Tonametric was invented. This instrument divided four octaves into 670
equal parts--this was an early instrument that made use of microtuning.
Emile Berliner (1851-1929) developed and patented the cylindrical and disc
phonograph system, simultaneously with Edison.
Dorr E. Felt perfected a calculator with key-driven ratchet wheels, which could
be moved by one or more teeth at a time.
1880 Alexander Graham Bell (1847-1922) financed his own laboratory in
Washington, D.C. Together with Charles S. Tainter, Bell devised and patented
several means for transmitting and recording sound.
1897 E.S. Votey invented the Pianola, an instrument that used a pre-punched,
perforated paper roll moved over a capillary bridge. The holes in the paper
corresponded to 88 openings in the board.
Lee De Forest (1873-1961) invented the Triode or Audion tube, the first
amplifying vacuum tube.
1907 Ferruccio Busoni (1866-1924) believed that the current musical system was
severely limited, so he stated that instrumental music was dead. His treatise on
aesthetics, Sketch of a New Esthetic of Music, discussed the future of music.
1910 The first radio broadcast in NYC (first radio station was built in 1920, also in
NYC).
1912 The Italian Futurist movement was founded by Luigi Russolo (1885-1947),
a painter, and Filippo Marinetti, a poet. Marinetti wrote the manifesto, Musica
Futurista; the Futurist Movement's creed was "To present the musical soul of the
masses, of the great factories, of the railways, of the transatlantic liners, of the
battleships, of the automobiles and airplanes. To add to the great central themes
of the musical poem the domain of the machines and the victorious kingdom of
Electricity."
Henry Cowell (1897-1965) introduced tone clusters in piano music. The Banshee
and Aeolian Harp are good examples.
1914 The first concert of Futurist music took place. The "art of noises" concert
was presented by Marinetti and Russolo in Milan, Italy.
1920 Lev (Leon) Theremin, Russia, invented the Aetherophone (later called the
Theremin or Thereminovox). The instrument used 2 vacuum tube oscillators to
produce beat notes. Musical sounds were created by "heterodyning" from
oscillators which varied pitch. A circuit was altered by changing the distance
between 2 elements. The instrument had a radio antenna to control dynamics
and a rod sticking out the side that controlled pitch. The performer would move
his/her hand along the rod to change pitch, while simultaneously moving his/her
other hand in proximity to the antenna. Many composers used this instrument
including Varese.
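Heterodyning, as described above, means mixing two high-frequency oscillators so the audible pitch is their difference; a small shift in one oscillator moves the beat note. The numbers in the sketch below are purely illustrative, not the Theremin's actual operating frequencies.

```python
# Numeric sketch of heterodyning: the audible "beat" pitch is the difference
# between the fixed and variable oscillator frequencies (values illustrative).
fixed_osc = 170_000.0          # Hz, fixed radio-frequency oscillator
variable_osc = 169_560.0       # Hz, shifted as the player's hand moves

beat = abs(fixed_osc - variable_osc)
print(beat)                            # 440.0 Hz -- the audible note (concert A)

variable_osc -= 100.0                  # a small further shift of the hand
print(abs(fixed_osc - variable_osc))   # 540.0 Hz -- the pitch rises
```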
1926 Jorg Mager built an electronic instrument, the Spharophon. The instrument
was first presented at the Donaueschingen Festival (Rimsky-Korsakov composed
some experimental works for this instrument). Mager later developed a
Partiturophon and a Kaleidophon, both used in theatrical productions. All of these
instruments were destroyed in W.W.II.
1928 Maurice Martenot (b. 1898, France) built the Ondes Martenot (first called
the Ondes Musicales). The instrument used the same basic idea as the
Theremin, but instead of a radio antenna, it utilized a moveable electrode
to produce capacitance variants. Performers wore a ring that passed over
the keyboard. The instrument used subtractive synthesis. Composers such as
Honegger, Messiaen, Milhaud, Dutilleux, and Varese all composed for the
instrument.
1929 Laurens Hammond (b. 1895, USA), built instruments such as the
Hammond Organ, Novachord, Solovox, and reverb devices in the United States.
The Hammond Organ used 91 rotary electromagnetic disk generators driven by a
synchronous motor with associated gears and tone wheels. It used additive
synthesis.
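Additive synthesis builds a tone by summing simple sine-wave partials, which is in spirit what the Hammond's spinning tone wheels did electromechanically. The short C++ sketch below is an illustration added here, not Hammond circuitry; the sample rate, frequency, and harmonic gains are arbitrary example values.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        const double pi = 3.14159265358979323846;
        const double sample_rate = 8000.0;               // samples per second (arbitrary)
        const double fundamental = 220.0;                // fundamental frequency in Hz (arbitrary)
        const double gains[] = {1.0, 0.5, 0.25, 0.125};  // one gain per harmonic

        std::vector<double> wave(16, 0.0);               // 16 samples, enough to print
        for (std::size_t n = 0; n < wave.size(); ++n) {
            double t = n / sample_rate;
            for (int h = 0; h < 4; ++h)                  // sum harmonics 1 through 4
                wave[n] += gains[h] * std::sin(2.0 * pi * fundamental * (h + 1) * t);
        }

        for (double s : wave)
            std::printf("%+.4f\n", s);                   // print the summed samples
        return 0;
    }

Subtractive synthesis, mentioned for the Ondes Martenot above, works the other way around: it starts from a harmonically rich waveform and filters partials away.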
1931 Ruth Crawford Seeger's String Quartet 1931 was composed. This is one of
the first works to employ extended serialism, a systematic organization of pitch,
rhythm, dynamics, and articulation.
Henry Cowell worked with Leon Theremin to build the Rhythmicon, an instrument
which could play metrical combinations of virtually unlimited complexity. With this
instrument Cowell composed the Rhythmicana Concerto.
Jorg Mager (Germany) was commissioned to create electronic bell sounds for
the Bayreuth production of Parsifal.
1937 "War of the Worlds" was directed by Orson Welles. Welles was the first
director to use the fade and dissolve technique, first seen in "Citizen Kane." To
date, most film directors used blunt splices instead.
1944 Percy Grainger and Burnett Cross patented a machine that "freed" music
from the constraints of conventional tuning systems and rhythmic inadequacies
of human performers. This mechanical invention for composing "Free Music" used
eight oscillators and synchronizing equipment in conjunction with photo-sensitive
graph paper, with the intention that the projected notation could be converted into
sound.
1947 Bell Labs developed and produced the solid state transistor.
Milton Babbitt's Three Compositions for Piano serialized all aspects of pitch,
rhythm, dynamics, and articulation.
1948 John Scott Trotter built a composition machine for popular music.
Hugh LeCaine (Canada) built the Electronic Sackbut, an instrument that actually
sounded like a cello.
1940s The following instruments were built: the Electronium Pi (actually used by
a few German composers, including Brehme, Degen, and Jacobi), the
Multimonica, the Polychord organ, the Tuttivox, the Marshall organ, and other
small electric organs.
1950 The Milan Studio was established by Luciano Berio (b. 1925, Italy).
1951-53 Eimert and Beyer (b. 1901) produced the first compositions using
electronically-generated pitches. The pieces used a mechanized device that
produced melodies based on Markov analysis of Stephen Foster tunes.
John Cage's 4'33" was composed. The composer was trying to liberate the
performer and the composer from having to make any conscious decisions,
therefore, the only sounds in this piece are those produce by the audience.
Louis and Bebe Barron set up a private studio in New York and provided electronic
sound scores for sci-fi films such as Forbidden Planet (1956) and Atlantis.
Otto Luening (b. 1900, USA; d. 1996, USA) and Vladimir Ussachevsky (b. 1911,
Manchuria; d. 1990, USA) presented the first concert at the Museum of Modern Art in
New York on October 28. The program included Ussachevsky's Sonic Contours
(created from piano recordings), and Luening's Fantasy in Space (using flute
recordings). Following the concert, they were asked to be on the Today Show
with Dave Garroway. Musicians Local 802 raised a fuss because Luening and
Ussachevsky were not members of the musicians' union.
1953-4 Karlheinz Stockhausen (b. 1928) used Helmholtz' research as the basis
of his Studie I and Studie II. He tried to build increasingly complex synthesized
sounds from simple pure frequencies (sine waves).
1954 The Cologne Radio Series "Music of Our Time" (October 19) used only
electronically-generated sounds by Stockhausen, Eimert, Pousseur, etc. The
pieces used strict serial techniques.
Dripsody was composed by Hugh LeCaine. The single sound source for this
concrete piece is a drip of water.
1955 Harry Olson and Herbert Belar, both working for RCA, invented the Electronic Music
Synthesizer, aka the Olson-Belar Sound Synthesizer. This synth used sawtooth
waves that were filtered for other types of timbres. The user programmed the
synthesizer with a typewriter-like keyboard that punched commands into a 40-
channel paper tape using binary code.
The Columbia-Princeton Studio started, with its beginnings mostly in the living
room of Ussachevsky and then the apartment of Luening.
Lejaren Hiller (1924-92) and Leonard Isaacson, from the University of Illinois,
composed the Illiac Suite for string quartet, the first piece of computer-generated
music. The piece was so named because it was composed at the University of Illinois
using the ILLIAC computer.
1956 Martin Klein and Douglas Bolitho used a Datatron computer called Push-
Button Bertha to compose music. This computer was used to compose popular
tunes; the tunes were derived from random numerical data that was sieved, or
mapped, into a preset tonal scheme.
Luening and Ussachevsky wrote incidental music for Orson Welles' King Lear at the
City Center, New York.
1957 Of Wood and Brass was composed by Luening. Sound sources included
trumpets, trombones and marimbas.
Scambi, composed by Henri Pousseur, was created at the Milan Studio, Italy.
1958 Edgard Varese (1883-1965) composed Poeme Electronique for the World's
Fair, Brussels. The work was composed for the Philips Pavilion, a building
designed by the famous architect Le Corbusier, who was assisted by Iannis
Xenakis (who later became well-known as a composer rather than an architect).
The work was performed on ca. 425 loudspeakers, and was accompanied by
projected images. This was truly one of the first large-scale multimedia
productions.
Iannis Xenakis (b.1922) composed Concret PH. This work was also composed
for the Brussels World's Fair. It made use of a single sound source: amplified
burning charcoal.
Pierre Henry left the Groupe de Musique Concrete; the group reorganized as the
Groupe de Recherches Musicales (GRM).
Gordon Mumma and Robert Ashley founded the Cooperative Studio for
Electronic Music, Ann Arbor, MI (University of Michigan).
Morton Subotnick, Pauline Oliveros, and Ramon Sender established the San
Francisco Tape Music Center.
1961 The first electronic music concerts at the Columbia-Princeton Studio were
held; the music was received with much hostility from other faculty members.
Fortran-based Music IV was used in the generation of "Bicycle Built for Two"
(Mathews).
Integrated circuits went into production; these would eventually lead to VLSI (very
large scale integration).
Robert Moog met Herbert Deutsch, and together they created a voltage-
controlled synthesizer.
Luciano Berio composed Visage. This radio composition is based on the idea of
non-verbal communication. There are many word-like passages, but only one
word is spoken during the entire composition (actually heard twice), parole
(Italian for 'word'). Cathy Berberian, the composer's wife, was the performer.
1962 Bell Labs mass-produced transistors, professional amplifiers, and power supplies.
PLF 2 was developed by James Tenney. This computer program was used to
write Four Stochastic Studies, Ergodos and others.
At the University of Illinois, Kenneth Gaburo composed Antiphony III, for chorus
and tape.
Paul Ketoff built the Synket. This synthesizer was built for composer John Eaton
and was designed specifically as a live performance instrument.
1963 Lejaren Hiller and Robert Baker composed the Computer Cantata.
Mario Davidovsky composed Synchronism I for flute and tape. Davidovsky has
since written many "synchronism" pieces. These works are all written for live
instrument(s) and tape, and they explore the synchronization of events between the
live performer and the tape.
1964 The fully developed Moog was released. The modular idea came from the
miniaturization of electronics.
Gottfried Michael Koenig used PR-1 (Project 1), a computer program that was
written in Fortran and implemented on an IBM 7090 computer. The purpose of
the program was to provide data for calculating structure in musical composition;
it was written to perform algorithmic serial operations on incoming data. The second
version of PR-1 was completed in 1965.
Varese died.
Steve Reich composed It's Gonna Rain. This is one of the first phase pieces.
1966 The Moog Quartet offered world-wide concerts of (mainly) parlor music.
1967 Walter Carlos (later Wendy) recorded Switched-On Bach using a Moog
synthesizer.
Leon Kirchner composed String Quartet No. 3, the first piece with electronics to
win the Pulitzer Prize.
Kenneth Gaburo composed Antiphony IV, a work for trombone, piccolo, choir and
tape.
Morton Subotnick composed Silver Apples of the Moon (title from Yeats), the first
work commissioned specifically for the recorded medium.
The Grateful Dead released Anthem of the Sun and Frank Zappa and the
Mothers of Invention released Uncle Meat. Both albums made extensive use of
electronic manipulation.
late 1960s The Sal-Mar Construction was built. The instrument was named for
composer Salvatore Martirano and designed by him. The Sal-Mar Construction
weighed over fifteen hundred pounds and consisted of "analog circuits controlled
by internal digital circuits controlled by the composer/performer via a touch-
control keyboard with 291 touch-sensitive keys."
Godfrey Winham and Hubert Howe adapted MUSIC IV for the IBM 7094 as
MUSIC4B, which was written in assembly language. MUSIC4BF was a Fortran-language
adaptation of MUSIC4B; one version was written by Winham, another was
written by Howe.
Music V variants included MUSIC360 and MUSIC11, for the IBM 360 and the
PDP-11 computers; these were written at MIT by Barry Vercoe, Roger Hale, and Carl
Howe.
GROOVE was developed by Mathews and F. Richard Moore at Bell Labs, and
was used to control analog synthesizers.
1970 Charles Wuorinen composed "Time's Encomium," the first entirely electronic
composition to win the Pulitzer Prize.
1972 Pink Floyd's album The Dark Side of the Moon was released; it used
ensembles of synthesizers, and concrete tracks served as interludes between
songs.
1973 SAWDUST, a language by Herbert Brun, used functions including:
ELEMENT, LINK, MINGLE, MERGER, VARY, and TURN.
1974 The Mellotron was built. The instrument was an early sample player that
used tape loops. There were versions that played string sounds or flute sounds,
and the instrument was used in movie soundtracks and on recordings.
1980 Philip Glass composed Satyagraha, another full scale opera in the
minimalist style.
1985 HMSL, the Hierarchical Music Specification Language, was released. The basic
organization of HMSL is a series of data structures called "morphs" (named for
the flexible or morphological design of the software). Within the superstructure of
these morphs there exist other data substructures named shapes, collections,
structures, productions, jobs, players, and actions. These secondary
types of morphs are used to control aspects of higher-level scheduling and
routines.
Interactor, by Morton Subotnick and Mark Coniglio, was designed specifically for
live performance and score-following capabilities.
The Max program was written in the C language and was developed at IRCAM
by Miller Puckette. It was later scheduled for distribution by Intelligent Music (the
company that also distributed M and Jam Factory), but it was the Opcode
company that eventually released it. Miller Puckette's original intention was to
build a language that could control IRCAM's 4X synthesizer, and there was no
need for the graphical implementation. The graphics were added after a version
of Max for the Macintosh using MIDI was proposed. Since 1989, David
Zicarelli has updated and expanded the program for the Macintosh environment.
Dolby SR was introduced.
1988 Steve Reich composed Different Trains for string quartet and tape.
Emagic was founded.
Macromedia was founded.
The first computers were people! That is, electronic computers (and the earlier
mechanical computers) were given this name because they performed the work
that had previously been assigned to people. "Computer" was originally a job
title: it was used to describe those human beings (predominantly women) whose
job it was to perform the repetitive calculations required to compute such things
as navigational tables, tide charts, and planetary positions for astronomical
almanacs. Imagine you had a job where hour after hour, day after day, you were
to do nothing but compute multiplications. Boredom would quickly set in, leading
to carelessness, leading to mistakes. And even on your best days you wouldn't
be producing answers very fast. Therefore, inventors have been searching for
hundreds of years for a way to mechanize (that is, find a mechanism that can
perform) this task.
The abacus was an early aid for mathematical computations. Its only value is
that it aids the memory of the human performing the calculation. A skilled abacus
operator can work on addition and subtraction problems at the speed of a person
equipped with a hand calculator (multiplication and division are slower). The
abacus is often wrongly attributed to China. In fact, the oldest surviving abacus
was used in 300 B.C. by the Babylonians. The abacus is still in use today,
principally in the far east. A modern abacus consists of rings that slide over rods,
but the older one pictured below dates from the time when pebbles were used for
counting (the word "calculus" comes from the Latin word for pebble).
In 1617 an eccentric (some say mad) Scotsman named John Napier invented
logarithms, which are a technology that allows multiplication to be performed via
addition. The magic ingredient is the logarithm of each operand, which was
originally obtained from a printed table. But Napier also invented an alternative to
tables, where the logarithm values were carved on ivory sticks which are now
called Napier's Bones.
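The trick is easy to verify on a modern machine. The short C++ sketch below is added here purely as an illustration, with arbitrary example numbers; it is obviously not Napier's own procedure, which obtained the logarithms from printed tables. It multiplies two numbers by adding their base-10 logarithms and converting the sum back:

    #include <cmath>
    #include <cstdio>

    int main() {
        // Napier's idea: log(a) + log(b) = log(a * b), so a multiplication
        // can be replaced by an addition plus two table look-ups.
        double a = 17.0, b = 23.0;                            // arbitrary example operands
        double sum_of_logs = std::log10(a) + std::log10(b);   // "look up" the two logs and add
        double product = std::pow(10.0, sum_of_logs);         // convert the sum back to a number
        std::printf("%.0f x %.0f = %.0f (direct check: %.0f)\n", a, b, product, a * b);
        return 0;
    }

A slide rule performs the same addition mechanically, by sliding two logarithmic scales past one another.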
A slide rule
The first gear-driven calculating machine to actually be built was probably the
calculating clock, so named by its inventor, the German professor Wilhelm
Schickard in 1623. This device got little publicity because Schickard died soon
afterward in the bubonic plague.
Schickard's Calculating Clock
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father
who was a tax collector. Pascal built 50 of these gear-driven one-function
calculators (they could only add) but couldn't sell many because of their exorbitant
cost and because they really weren't that accurate (at that time it was not
possible to fabricate gears with the required precision). Up until the present age
when car dashboards went digital, the odometer portion of a car's speedometer
used the very same mechanism as the Pascaline to increment the next wheel
after each full revolution of the prior wheel. Pascal was a child prodigy. At the
age of 12, he was discovered doing his version of Euclid's thirty-second
proposition on the kitchen floor. Pascal went on to invent probability theory, the
hydraulic press, and the syringe. Shown below is an 8 digit version of the
Pascaline, and two views of a 6 digit version:
Pascal's Pascaline [photo © 2002 IEEE]
A 6 digit model for those who couldn't afford the 8 digit model
A Pascaline opened up so you can observe the gears and cylinders
which rotated to display the numerical result
Click on the "Next" hyperlink below to read about the punched card system that
was developed for looms for later applied to the U.S. census and then to
computers...
Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor
with Newton of calculus) managed to build a four-function (addition, subtraction,
multiplication, and division) calculator that he called the stepped reckoner
because, instead of gears, it employed fluted drums having ten flutes arranged
around their circumference in a stair-step fashion. Although the stepped reckoner
employed the decimal number system (each drum had 10 flutes), Leibniz was the
first to advocate use of the binary number system which is fundamental to the
operation of modern computers. Leibniz is considered one of the greatest of the
philosophers but he died poor and alone.
In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could
base its weave (and hence the design on the fabric) upon a pattern automatically
read from punched wooden cards, held together in a long row by rope.
Descendants of these punched cards have been in use ever since (remember
the "hanging chad" from the Florida presidential ballots of the year 2000?).
Jacquard's Loom showing the threads and the punched cards
By selecting particular cards for Jacquard's loom you defined the
woven pattern [photo © 2002 IEEE]
A close-up of a Jacquard card
Jacquard's technology was a real boon to mill owners, but put many loom
operators out of work. Angry mobs smashed Jacquard looms and once attacked
Jacquard himself. History is full of examples of labor unrest following
technological innovation yet most studies show that, overall, technology has
actually increased the number of jobs.
By 1822 the English mathematician Charles Babbage was proposing a steam
driven calculating machine the size of a room, which he called the Difference
Engine. This machine would be able to compute tables of numbers, such as
logarithm tables. He obtained government funding for this project due to the
importance of numeric tables in ocean navigation. By promoting their commercial
and military navies, the British government had managed to become the earth's
greatest empire. But in that time frame the British government was publishing a
seven volume set of navigation tables which came with a companion volume of
corrections which showed that the set had over 1000 numerical errors. It was
hoped that Babbage's machine could eliminate errors in these types of tables.
But construction of Babbage's Difference Engine proved exceedingly difficult and
the project soon became the most expensive government funded project up to
that point in English history. Ten years later the device was still nowhere near
complete, acrimony abounded between all involved, and funding dried up. The
device was never finished.
A small section of the type of mechanism employed in Babbage's
Difference Engine [photo © 2002 IEEE]
Babbage was not deterred, and by then was on to his next brainstorm, which he
called the Analytic Engine. This device, large as a house and powered by 6
steam engines, would be more general purpose in nature because it would be
programmable, thanks to the punched card technology of Jacquard. But it was
Babbage who made an important intellectual leap regarding the punched cards.
In the Jacquard loom, the presence or absence of each hole in the card
physically allows a colored thread to pass or stops that thread (you can see this
clearly in the earlier photo). Babbage saw that the pattern of holes could be used
to represent an abstract idea such as a problem statement or the raw data
required for that problem's solution. Babbage saw that there was no requirement
that the problem matter itself physically pass thru the holes.
The Analytic Engine also had a key function that distinguishes computers from
calculators: the conditional statement. A conditional statement allows a program
to achieve different results each time it is run. Based on the conditional
statement, the path of the program (that is, what statements are executed next)
can be determined based upon a condition or situation that is detected at the
very moment the program is running.
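To make the idea concrete, here is a minimal modern sketch; it is a hypothetical C++ fragment added purely for illustration, since Babbage's engine of course worked with gears and punched cards rather than typed code. Which statement executes next is decided only while the program is running, based on the value it has just read:

    #include <iostream>

    int main() {
        int balance = 0;
        std::cout << "Enter an account balance: ";
        std::cin >> balance;

        // The conditional statement: the path taken depends on a condition
        // detected at the very moment the program is running.
        if (balance < 0) {
            std::cout << "Account is overdrawn.\n";
        } else {
            std::cout << "Account is in good standing.\n";
        }
        return 0;
    }

Run twice with different inputs, the same program produces different behavior, which is exactly what separates it from a fixed-function calculator.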
Babbage befriended Ada Byron, the daughter of the famous poet Lord Byron
(Ada would later become the Countess Lady Lovelace by marriage). Though she
was only 19, she was fascinated by Babbage's ideas and thru letters and
meetings with Babbage she learned enough about the design of the Analytic
Engine to begin fashioning programs for the still unbuilt machine. While Babbage
refused to publish his knowledge for another 30 years, Ada wrote a series of
"Notes" wherein she detailed sequences of instructions she had prepared for the
Analytic Engine. The Analytic Engine remained unbuilt (the British government
refused to get involved with this one) but Ada earned her spot in history as the
first computer programmer. Ada invented the subroutine and was the first to
recognize the importance of looping. Babbage himself went on to invent the
modern postal system, cowcatchers on trains, and the ophthalmoscope, which is
still used today to examine the eye.
The next breakthrough occurred in America. The U.S. Constitution states that a
census should be taken of all U.S. citizens every 10 years in order to determine
the representation of the states in Congress. While the very first census of 1790
had only required 9 months, by 1880 the U.S. population had grown so much that
the count for the 1880 census took 7.5 years. Automation was clearly needed for
the next census. The census bureau offered a prize for an inventor to help with
the 1890 census and this prize was won by Herman Hollerith, who proposed and
then successfully adapted Jacquard's punched cards for the purpose of
computation.
Hollerith built a company, the Tabulating Machine Company which, after a few
buyouts, eventually became International Business Machines, known today as
IBM. IBM grew rapidly and punched cards became ubiquitous. Your gas bill
would arrive each month with a punch card you had to return with your payment.
This punch card recorded the particulars of your account: your name, address,
gas usage, etc. (I imagine there were some "hackers" in these days who would
alter the punch cards to change their bill). As another example, when you
entered a toll way (a highway that collects a fee from each driver) you were given
a punch card that recorded where you started and then when you exited from the
toll way your fee was computed based upon the miles you drove. When you
voted in an election the ballot you were handed was a punch card. The little
pieces of paper that are punched out of the card are called "chad" and were
thrown as confetti at weddings. Until recently all Social Security and other checks
issued by the Federal government were actually punch cards. The check-out slip
inside a library book was a punch card. Written on all these cards was a phrase
as common as "close cover before striking": "do not fold, spindle, or mutilate". A
spindle was an upright spike on the desk of an accounting clerk. As he
completed processing each receipt he would impale it on this spike. When the
spindle was full, he'd run a piece of string through the holes, tie up the bundle,
and ship it off to the archives. You occasionally still see spindles at restaurant
cash registers.
But the U.S. military desired a mechanical calculator more optimized for scientific
computation. By World War II the U.S. had battleships that could lob shells
weighing as much as a small car over distances up to 25 miles. Physicists could
write the equations that described how atmospheric drag, wind, gravity, muzzle
velocity, etc. would determine the trajectory of the shell. But solving such
equations was extremely laborious. This was the work performed by the human
computers. Their results appeared in ballistic "firing tables" published in
gunnery manuals. During World War II the U.S. military scoured the country
looking for (generally female) math majors to hire for the job of computing these
tables. But not enough humans could be found to keep up with the need for new
tables. Sometimes artillery pieces had to be delivered to the battlefield without
the necessary firing tables and this meant they were close to useless because
they couldn't be aimed properly. Faced with this situation, the U.S. military was
willing to invest in even harebrained schemes to automate this type of
computation.
One early success was the Harvard Mark I computer which was built as a
partnership between Harvard and IBM in 1944. This was the first programmable
digital computer made in the U.S. But it was not a purely electronic computer.
Instead the Mark I was constructed out of switches, relays, rotating shafts, and
clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8
feet tall and 51 feet long, and had a 50 ft rotating shaft running its length, turned
by a 5 horsepower electric motor. The Mark I ran non-stop for 15 years, sounding
like a roomful of ladies knitting. To appreciate the scale of this machine note the
four typewriters in the foreground of the following photo.
The Harvard Mark I: an electro-mechanical computer
You can see the 50 ft rotating shaft in the bottom of the prior photo. This shaft
was a central power source for the entire machine. This design feature was
reminiscent of the days when waterpower was used to run a machine shop and
each lathe or other tool was driven by a belt connected to a single overhead shaft
which was turned by an outside waterwheel.
A central shaft driven by an outside waterwheel and connected to
each machine by overhead belts was the customary power
source for all the machines in a factory
Here's a close-up of one of the Mark I's four paper tape readers. A paper tape
was an improvement over a box of punched cards as anyone who has ever
dropped -- and thus shuffled -- his "stack" knows.
One of the four paper tape readers on the Harvard Mark I (you can
observe the punched paper roll emerging from the bottom)
One of the primary programmers for the Mark I was a woman, Grace Hopper.
Hopper found the first computer "bug": a dead moth that had gotten into the Mark
I and whose wings were blocking the reading of the holes in the paper tape. The
word "bug" had been used to describe a defect since at least 1889 but Hopper is
credited with coining the word "debugging" to describe the work to eliminate
program faults.
In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This
language eventually became COBOL which was the language most affected by
the infamous Y2K problem. A high-level language is designed to be more
understandable by humans than is the binary language understood by the
computing machinery. A high-level language is worthless without a program --
known as a compiler -- to translate it into the binary language of the computer
and hence Grace Hopper also constructed the world's first compiler. Grace
remained active as a Rear Admiral in the Navy Reserves until she was 79
(another record).
The Mark I operated on numbers that were 23 digits wide. It could add or
subtract two of these numbers in three-tenths of a second, multiply them in four
seconds, and divide them in ten seconds. Forty-five years later computers could
perform an addition in a billionth of a second! Even though the Mark I had three
quarters of a million components, it could only store 72 numbers! Today, home
computers can store 30 million numbers in RAM and another 10 billion numbers
on their hard disk. Today, a number can be pulled from RAM after a delay of only
a few billionths of a second, and from a hard disk after a delay of only a few
thousandths of a second. This kind of speed is obviously impossible for a
machine which must move a rotating shaft and that is why electronic computers
killed off their mechanical predecessors.
The Apple 1 which was sold as a do-it-yourself kit (without the lovely
case seen here)
Computers had been incredibly expensive because they required so much hand
assembly, such as the wiring seen in this CDC 7600:
Typical wiring in an early mainframe computer [photo courtesy The
Computer Museum]
The microelectronics revolution is what allowed the amount of hand-crafted
wiring seen in the prior photo to be mass-produced as an integrated circuit
which is a small sliver of silicon the size of your thumbnail .
An integrated circuit ("silicon chip") [photo courtesy of IBM]
The primary advantage of an integrated circuit is not that the transistors
(switches) are minuscule (that's the secondary advantage), but rather that millions
of transistors can be created and interconnected in a mass-production process.
All the elements on the integrated circuit are fabricated simultaneously via a
small number (maybe 12) of optical masks that define the geometry of each
layer. This speeds up the process of fabricating the computer -- and hence
reduces its cost -- just as Gutenberg's printing press sped up the fabrication of
books and thereby made them affordable to all.
The IBM Stretch computer of 1959 needed its 33 foot length to hold the 150,000
transistors it contained. These transistors were tremendously smaller than the
vacuum tubes they replaced, but they were still individual elements requiring
individual assembly. By the early 1980s this many transistors could be
simultaneously fabricated on an integrated circuit. Today's Pentium 4
microprocessor contains 42,000,000 transistors in this same thumbnail sized
piece of silicon.
It's humorous to remember that in between the Stretch machine (which would be
called a mainframe today) and the Apple I (a desktop computer) there was an
entire industry segment referred to as mini-computers such as the following
PDP-12 computer of 1969:
The DEC PDP-12
Sure looks "mini", huh? But we're getting ahead of our story.
One of the earliest attempts to build an all-electronic (that is, no gears, cams,
belts, shafts, etc.) digital computer was made in 1937 by J. V. Atanasoff, a
professor of physics and mathematics at Iowa State University. By 1941 he and
his graduate student, Clifford Berry, had succeeded in building a machine that
could solve 29 simultaneous equations with 29 unknowns. This machine was the
first to store data as a charge on a capacitor, which is how today's computers
store information in their main memory (DRAM or dynamic RAM). As far as its
inventors were aware, it was also the first to employ binary arithmetic. However,
the machine was not programmable, it lacked a conditional branch, its design
was appropriate for only one type of mathematical problem, and it was not further
pursued after World War II. Its inventors didn't even bother to preserve the
machine and it was dismantled by those who moved into the room where it lay
abandoned.
Another candidate for granddaddy of the modern computer was Colossus, built
during World War II by Britain for the purpose of breaking the cryptographic
codes used by Germany. Britain led the world in designing and building
electronic machines dedicated to code breaking, and was routinely able to read
coded German radio transmissions. But Colossus was definitely not a general
purpose, reprogrammable machine. Note the presence of pulleys in the two
photos of Colossus below:
Two views of the code-breaking Colossus of Great Britain
The Harvard Mark I, the Atanasoff-Berry computer, and the British Colossus all
made important contributions. American and British computer pioneers were still
arguing over who was first to do what, when in 1965 the work of the German
Konrad Zuse was published for the first time in English. Scooped! Zuse had built
a sequence of general purpose computers in Nazi Germany. The first, the Z1,
was built between 1936 and 1938 in the parlor of his parents' home.
The Zuse Z1 in its residential setting
Zuse's third machine, the Z3, built in 1941, was probably the first operational,
general-purpose, programmable (that is, software controlled) digital computer.
Without knowledge of any calculating machine inventors since Leibniz (who lived
in the 1600's), Zuse reinvented Babbage's concept of programming and decided
on his own to employ binary representation for numbers (Babbage had
advocated decimal). The Z3 was destroyed by an Allied bombing raid. The Z1
and Z2 met the same fate and the Z4 survived only because Zuse hauled it in a
wagon up into the mountains. Zuse's accomplishments are all the more
incredible given the context of the material and manpower shortages in Germany
during World War II. Zuse couldn't even obtain paper tape so he had to make his
own by punching holes in discarded movie film. Because these machines were
unknown outside Germany, they did not influence the path of computing in
America. But their architecture is identical to that still in use today: an arithmetic
unit to do the calculations, a memory for storing numbers, a control system to
supervise operations, and input and output devices to connect to the external
world. Zuse also invented what might be the first high-level computer language,
"Plankalkul", though it too was unknown outside Germany.
Click on the "Next" hyperlink below to read about Eniac, Univac, IBM
mainframes, and the IBM PC...
Previous Computer Science Lab Next
ENIAC filled a 20 by 40 foot room, weighed 30 tons, and used more than 18,000
vacuum tubes. Like the Mark I, ENIAC employed paper card readers obtained
from IBM (these were a regular product for IBM, as they were a long established
part of business accounting machines, IBM's forte). When operating, the ENIAC
was silent but you knew it was on as the 18,000 vacuum tubes each generated
waste heat like a light bulb and all this heat (174,000 watts of heat) meant that
the computer could only be operated in a specially designed room with its own
heavy duty air conditioning system. Only the left half of ENIAC is visible in the
first picture, the right half was basically a mirror image of what's visible.
Two views of ENIAC: the "Electronic Numerical Integrator and
Calculator" (note that it wasn't even given the name of
computer since "computers" were people) [U.S. Army photo]
To reprogram the ENIAC you had to rearrange the patch cords that you can
observe on the left in the prior photo, and the settings of 3000 switches that you
can observe on the right. To program a modern computer, you type out a
program with statements like:
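(a complete but hypothetical C++ example, added purely as an illustration of what a typed program looks like)

    #include <iostream>

    int main() {
        int speed = 120;                    // a made-up value for the illustration
        if (speed > 100)
            std::cout << "Slow down!\n";    // a single typed statement
        return 0;
    }

Changing such a program means editing and recompiling the text, not rewiring patch cords or flipping thousands of switches.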
Once the army agreed to fund ENIAC, Mauchly and Eckert worked around the
clock, seven days a week, hoping to complete the machine in time to contribute
to the war. Their war-time effort was so intense that most days they ate all 3
meals in the company of the army Captain who was their liaison with their military
sponsors. They were allowed a small staff but soon observed that they could hire
only the most junior members of the University of Pennsylvania staff because the
more experienced faculty members knew that their proposed machine would
never work.
One of the most obvious problems was that the design would require 18,000
vacuum tubes to all work simultaneously. Vacuum tubes were so notoriously
unreliable that even twenty years later many neighborhood drug stores provided
a "tube tester" that allowed homeowners to bring in the vacuum tubes from their
television sets and determine which one of the tubes was causing their TV to fail.
And television sets only incorporated about 30 vacuum tubes. The device that
used the largest number of vacuum tubes was an electronic organ: it
incorporated 160 tubes. The idea that 18,000 tubes could function together was
considered so unlikely that the dominant vacuum tube supplier of the day, RCA,
refused to join the project (but did supply tubes in the interest of "wartime
cooperation"). Eckert solved the tube reliability problem through extremely careful
circuit design. He was so thorough that before he chose the type of wire cabling
he would employ in ENIAC he first ran an experiment where he starved lab rats
for a few days and then gave them samples of all the available types of cable to
determine which they least liked to eat. Here's a look at a small number of the
vacuum tubes in ENIAC:
Even with 18,000 vacuum tubes, ENIAC could only hold 20 numbers at a time.
However, thanks to the elimination of moving parts it ran much faster than the
Mark I: a multiplication that required 6 seconds on the Mark I could be performed
on ENIAC in 2.8 thousandths of a second. ENIAC's basic clock speed was
100,000 cycles per second. Today's home computers employ clock speeds of
1,000,000,000 cycles per second. Built with $500,000 from the U.S. Army,
ENIAC's first task was to compute whether or not it was possible to build a
hydrogen bomb (the atomic bomb was completed during the war and hence is
older than ENIAC). The very first problem run on ENIAC required only 20
seconds and was checked against an answer obtained after forty hours of work
with a mechanical calculator. After chewing on half a million punch cards for six
weeks, ENIAC did humanity no favor when it declared the hydrogen bomb
feasible. This first ENIAC program remains classified even today.
Once ENIAC was finished and proved worthy of the cost of its development, its
designers set about to eliminate the obnoxious fact that reprogramming the
computer required a physical modification of all the patch cords and switches. It
took days to change ENIAC's program. Eckert and Mauchly next teamed up
with the mathematician John von Neumann to design EDVAC, which pioneered
the stored program. Because he was the first to publish a description of this
new computer, von Neumann is often wrongly credited with the realization that
the program (that is, the sequence of computation steps) could be represented
electronically just as the data was. But this major breakthrough can be found in
Eckert's notes long before he ever started working with von Neumann. Eckert
was no slouch: while in high school Eckert had scored the second highest math
SAT score in the entire country.
After ENIAC and EDVAC came other computers with humorous names such as
ILLIAC, JOHNNIAC, and, of course, MANIAC. ILLIAC was built at the University
of Illinois at Champaign-Urbana, which is probably why the science fiction author
Arthur C. Clarke chose to have the HAL computer of his famous book "2001: A
Space Odyssey" born at Champaign-Urbana. Have you ever noticed that you can
shift each of the letters of IBM backward by one alphabet position and get HAL?
Today, one of the most notable characteristics of a computer is the fact that its
ability to be reprogrammed allows it to contribute to a wide variety of endeavors,
such as the following completely unrelated fields:
• the creation of special effects for movies,
• the compression of music to allow more minutes of music to fit within the
limited memory of an MP3 player,
• the observation of car tire rotation to detect and prevent skids in an anti-
lock braking system (ABS),
• the analysis of the writing style in Shakespeare's work with the goal of
proving whether a single individual really was responsible for all these
pieces.
By the end of the 1950's computers were no longer one-of-a-kind hand built
devices owned only by universities and government research labs. Eckert and
Mauchly left the University of Pennsylvania over a dispute about who owned the
patents for their invention. They decided to set up their own company. Their first
product was the famous UNIVAC computer, the first commercial (that is, mass
produced) computer. In the 50's, UNIVAC (a contraction of "Universal Automatic
Computer") was the household word for "computer" just as "Kleenex" is for
"tissue". The first UNIVAC was sold, appropriately enough, to the Census
bureau. UNIVAC was also the first computer to employ magnetic tape. Many
people still confuse a picture of a reel-to-reel tape recorder with a picture of a
mainframe computer.
A reel-to-reel tape drive [photo courtesy of The Computer Museum]
ENIAC was unquestionably the origin of the U.S. commercial computer industry,
but its inventors, Mauchly and Eckert, never achieved fortune from their work and
their company fell into financial problems and was sold at a loss. By 1955 IBM
was selling more computers than UNIVAC and by the 1960's the group of eight
companies selling computers was known as "IBM and the seven dwarfs". IBM
grew so dominant that the federal government pursued anti-trust proceedings
against them from 1969 to 1982 (notice the pace of our country's legal system).
You might wonder what type of event is required to dislodge an industry
heavyweight. In IBM's case it was their own decision to hire an unknown but
aggressive firm called Microsoft to provide the software for their personal
computer (PC). This lucrative contract allowed Microsoft to grow so dominant
that by the year 2000 their market capitalization (the total value of their stock)
was twice that of IBM and they were convicted in Federal Court of running an
illegal monopoly.
If you learned computer programming in the 1970's, you dealt with what today
are called mainframe computers, such as the IBM 7090 (shown below), IBM
360, or IBM 370.
A teletype was a motorized typewriter that could transmit your keystrokes to the
mainframe and then print the computer's response on its roll of paper. You typed
a single line of text, hit the carriage return button, and waited for the teletype to
begin noisily printing the computer's response (at a whopping 10 characters per
second). On the left-hand side of the teletype in the prior picture you can observe
a paper tape reader and writer (i.e., puncher). Here's a close-up of paper tape:
Paper tape has a long history as well. It was first used as an information storage
medium by Sir Charles Wheatstone, who used it to store Morse code that was
arriving via the newly invented telegraph (incidentally, Wheatstone was also the
inventor of the accordion).
The alternative to time sharing was batch mode processing, where the
computer gives its full attention to your program. In exchange for getting the
computer's full attention at run-time, you had to agree to prepare your program
off-line on a key punch machine which generated punch cards.
An IBM Key Punch machine which operates like a typewriter except it
produces punched cards rather than a printed sheet of paper
University students in the 1970's bought blank cards a linear foot at a time from
the university bookstore. Each card could hold only 1 program statement. To
submit your program to the mainframe, you placed your stack of cards in the
hopper of a card reader. Your program would be run whenever the computer
made it that far. You often submitted your deck and then went to dinner or to bed
and came back later hoping to see a successful printout showing your results.
Obviously, a program run in batch mode could not be interactive.
But things changed fast. By the 1990's a university student would typically own
his own computer and have exclusive use of it in his dorm room.
The original IBM Personal Computer (PC)
The Japanese calculator maker Busicom had asked Intel to design a set of custom
chips for a new calculator, but a new Intel employee (Ted Hoff) convinced Busicom to instead accept a
general purpose computer chip which, like all computers, could be
reprogrammed for many different tasks (like controlling a keyboard, a display, a
printer, etc.). Intel argued that since the chip could be reprogrammed for
alternative purposes, the cost of developing it could be spread out over more
users and hence would be less expensive to each user. The general purpose
computer is adapted to each new purpose by writing a program which is a
sequence of instructions stored in memory (which happened to be Intel's forte).
Busicom agreed to pay Intel to design a general purpose chip and to get a price
break since it would allow Intel to sell the resulting chip to others. But
development of the chip took longer than expected and Busicom pulled out of the
project. Intel knew it had a winner by that point and gladly refunded all of
Busicom's investment just to gain sole rights to the device which they finished on
their own.
Thus was born the Intel 4004, the first microprocessor (uP). The 4004 consisted of
2300 transistors and was clocked at 108 kHz (i.e., 108,000 times per second).
Compare this to the 42 million transistors and the 2 GHz clock rate (i.e.,
2,000,000,000 times per second) used in a Pentium 4. One of Intel's 4004 chips
still functions aboard the Pioneer 10 spacecraft, which is now the man-made
object farthest from the earth. Curiously, Busicom went bankrupt and never
ended up using the ground-breaking microprocessor.
Intel followed the 4004 with the 8008 and 8080. Intel priced the 8080
microprocessor at $360 as an insult to IBM's famous 360 mainframe
which cost millions of dollars. The 8080 was employed in the MITS Altair
computer, which was the world's first personal computer (PC). It was personal
all right: you had to build it yourself from a kit of parts that arrived in the mail. This
kit didn't even include an enclosure and that is the reason the unit shown below
doesn't match the picture on the magazine cover.
The Altair 8800, the first PC
A Harvard freshman by the name of Bill Gates decided to drop out of college so
he could concentrate all his time writing programs for this computer. This early
experience put Bill Gates in the right place at the right time once IBM decided to
standardize on the Intel microprocessors for their line of PCs in 1981. The Intel
Pentium 4 used in today's PCs is still compatible with the Intel 8088 used in
IBM's first PC.
If you've enjoyed this history of computers, I encourage you to try your own hand
at programming a computer. That is the only way you will really come to
understand the concepts of looping, subroutines, high and low-level languages,
bits and bytes, etc. I have written a number of Windows programs which teach
computer programming in a fun, visually-engaging setting. I start my students on
a programmable RPN calculator where we learn about programs, statements,
program and data memory, subroutines, logic and syntax errors, stacks, etc.
Then we move on to an 8051 microprocessor (which happens to be the most
widespread microprocessor on earth) where we learn about microprocessors,
bits and bytes, assembly language, addressing modes, etc. Finally, we graduate
to the most powerful language in use today: C++ (pronounced "C plus plus").
These Windows programs are accompanied by a book's worth of on-line
documentation which serves as a self-study guide, allowing you to teach yourself
computer programming! The home page (URL) for this collection of software is
www.computersciencelab.com.
Bibliography:
"ENIAC: The Triumphs and Tragedies of the World's First Computer" by Scott
McCartney.