
2

Why Build Computers?

The Military Role in Computer Research

On the battlefield of the future, enemy forces will be located, tracked, and
targeted almost instantaneously through the use of data links, computer
assisted intelligence evaluation, and automated fire control. With first round
kill probabilities approaching certainty, and with surveillance devices that
can continually track the enemy, the need for large forces to fix the opposition
physically will be less important. . . . [A]n improved communicative system . . .
would permit commanders to be continually aware of the entire battlefield
panorama down to squad and platoon level. . . . Today, machines and
technology are permitting economy of manpower on the battlefield. . . . But the
future offers even more possibilities for economy. I am confident the American
people expect this country to take full advantage of its technology -- to
welcome and applaud the developments that will replace wherever possible
the man with the machine. . . . With cooperative effort, no more than 10 years
should separate us from the automated battlefield.1

-- General William Westmoreland, former Commander-in-Chief of U.S.
forces in Vietnam, 1969

For two decades, from the early 1940s until the early 1960s, the armed forces of
the United States were the single most important driver of digital computer
development. Though most of the research work took place at universities and
in commercial firms, military research organizations such as the Office of Naval
Research, the Communications Security Group (known by its code name OP-20-
G), and the Air Comptroller’s Office paid for it. Military users became the
proving ground for initial concepts and prototype machines. As the commercial
computer industry began to take shape, the armed forces and the defense
industry served as the major marketplace. Most historical accounts recognize the
financial importance of this backing in early work on computers. But few, to
date, have grasped the deeper significance of this military involvement.

At the end of World War II, the electronic digital computer technology we
take for granted today was still in its earliest infancy. It was expensive, failure-
prone, and ill-understood. Digital computers were seen as calculators, useful
primarily for accounting and advanced scientific research. An alternative

1 General William Westmoreland, U.S. Army Chief of Staff, “Address to the Association of the
U.S. Army.” Reprinted in Paul Dickson, The Electronic Battlefield (Bloomington, IN: Indiana
University Press, 1976), 215–223.
technology, analog computing, was relatively cheap, reliable (if not terribly
accurate), better developed, and far better supported by both industrial and
academic institutions. For reasons we will explore below, analog computing was
more easily adapted to the control applications that constituted the major uses of
computers in battle. Only in retrospect does it appear obvious that command,
control, and communications should be united within a single technological
frame (to use Wiebe Bijker’s term) centered around electronic digital
computers.2

Why, then, did military agencies provide such lavish funding for digital
computer research and development? What were their near-term goals and
long-term visions, and how were these coupled to the grand strategy and
political culture of the Cold War? How were those goals and visions shaped over
time, as computers moved out of laboratories and into rapidly changing military
systems?

I will argue that military support for computer research was rarely benign
or disinterested, as many historians, taking at face value the public postures of
funding agencies and the reports of project leaders, have assumed. Instead,
practical military objectives guided technological development down particular
channels, increased its speed, and helped shape the structure of the emerging
computer industry. I will also argue, however, that the social relations between
military agencies and civilian researchers were by no means one-sided. More
often than not it was civilians, not military planners, who pushed the
application of computers to military problems. Together, in the context of the
Cold War, they enrolled computers as supports for a far-reaching discourse of
centralized command and control -- as an enabling, infrastructural technology
for the closed-world political vision.

The Background: Computers in World War II

During World War II, virtually all computer research (like most scientific
research and development) was funded directly by the War Department as part of
the war effort. But there are particularly intimate links between early digital
computer research, key military needs, and the political fortunes of science and
engineering after the war. These connections had their beginnings in problems
of ballistics.

One of the Allies’ most pressing problems in World War II was the feeble
accuracy of antiaircraft guns. Airplanes had evolved enormously since World

2 See Wiebe E. Bijker, “The Social Construction of Bakelite: Toward a Theory of Invention,” in
Bijker, Hughes, and Pinch, The Social Construction of Technology, 159–187.
War I, gaining speed and maneuverability. Defense from devastating bombing
raids depended largely on ground-based antiaircraft weapons. But judging how
far ahead of the fast-moving, rapidly turning planes to aim their guns was a task
beyond the skills of most gunners. Vast amounts of ammunition were expended
to bring down a distressingly small number of enemy bombers. The German V-1
“buzz bombs” that attacked London in 1944 made a solution even more urgent.
The problem was solved by fitting the guns with “servomechanisms” -- which
combine a kind of mechanical or electro-mechanical analog computer with a
control mechanism -- able to calculate the plane’s probable future position.3

Building these devices, called “gun directors,” required trajectory tables in
which relations between variables such as the caliber of the gun, the size of its
shell, and the character of its fuse were calculated out. Ballistics calculations of
this sort have a long history in warfare, dating almost to the invention of
artillery. Galileo, for example, invented and marketed a simple calculating aid
called a “gunner’s compass” that allowed artillerymen to measure the angle of a
gun and compute, on an ad hoc basis, the amount of powder necessary to fire a
cannonball a given distance.4 As artillery pieces became increasingly powerful
and complex, precalculated ballistics tables became the norm. The computation
of these tables grew into a minor military industry. During World War I, young
mathematicians such as Norbert Wiener and Oswald Veblen worked on these
problems at the Army’s Aberdeen Proving Ground. Such mathematicians were
called “computers.”5

In World War II, with its constant and rapid advances in gunnery,
Aberdeen’s work became a major bottleneck in fielding new artillery and

3 Mechanical systems are classical Aristotelian machines (e.g., car engines) that perform work
using the physical movement of levers, gears, wheels, etc. Electro-mechanical systems are
machines powered by electric motors or electromagnets (e.g., vacuum cleaners), but part or all of
whose function is still performed through the physical movement of parts. Electronic systems, by
contrast, contain few or no moving parts. They consist entirely of electrical circuitry and perform
their work through transformations of electric current (e.g., televisions or stereo systems).
The distinction between digital and analog methods corresponds closely to the more
intuitive difference between counting and measuring. Digital calculation uses discrete states, such
as the ratchet-like detents of clockwork gears (mechanical), the on-off states of relays (electro-
mechanical switches), or the positive or negative electrical states of transistors (electronic), to
represent discrete numerical values (1, 2, 3, etc.). These values can then be added, subtracted, and
multiplied, essentially by a process of counting. Analog calculation, by contrast, employs
continuously variable states, such as the ratio between the moving parts of a slide rule
(mechanical), the speed of a motor’s rotation (electro-mechanical), or the voltage of a circuit
(electronic), to represent continuously variable numerical quantities (e.g., any value between 0 and
10). These quantities can then be physically combined to represent addition, subtraction, and
multiplication, for example as someone might measure the perimeter of a room by cutting pieces of
string to the length of each wall and then tying them together.
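
The distinction in note 3 can be made concrete with a small illustration. The following sketch (my own, in Python; nothing of the kind appears in the sources, and the function names are invented) contrasts digital addition, which combines discrete counts and is exact, with an analog addition modeled on the string analogy above, which combines continuous lengths and is only as precise as the measurement.

import random

# Digital calculation: discrete states combined by counting; the result is exact.
def digital_add(a_count, b_count):
    total = 0
    for _ in range(a_count + b_count):   # count up one unit at a time
        total += 1
    return total

# Analog calculation: continuous magnitudes (here, lengths of string) physically
# joined; the result carries whatever error the measurement introduces.
def analog_add(a_length, b_length, measurement_error=0.01):
    joined = a_length + b_length                     # tie the strings together
    return joined + random.uniform(-measurement_error, measurement_error)

print(digital_add(2, 3))     # exactly 5
print(analog_add(2.0, 3.0))  # roughly 5, within the measurement error
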
4 Michael R. Williams, A History of Computing Technology (Englewood Cliffs, NJ: Prentice-Hall,
1985), 78–83.
5 Pesi R. Masani, Norbert Wiener, 1894–1964 (Boston: Birkhäuser, 1990), 68.
antiaircraft systems. Both Wiener and Veblen -- by then distinguished professors
at MIT and Princeton respectively -- once again made contributions. Wiener
worked on the antiaircraft gunnery problem at its most general level. His
wartime studies culminated in the theory of cybernetics (a major precursor of
cognitive psychology). Veblen returned to Aberdeen’s ballistics work as head of
the scientific staff of the Ballistics Research Laboratory (BRL). Just as in World
War I, Veblen’s group employed hundreds of people, this time mostly women,
to compute tables by hand using desk calculators. These women, too, were called
“computers.” Only later, and gradually, was the name transferred to the
machines.6

But alongside them Aberdeen also employed the largest analog calculator
of the 1930s: the differential analyzer, invented by MIT electrical engineer
Vannevar Bush.

Vannevar Bush: Creating an Infrastructure for Scientific Research

Bush invented the differential analyzer at MIT in 1930 to assist in the solution of
equations associated with large electric power networks. The machine used a
system of rotating disks, rods, and gears powered by electric motors to solve
complex differential equations (hence its name). The BRL immediately sought to
copy the device, with improvements, completing its own machine in 1935 at
Aberdeen. At the same time, another copy was constructed at the University of
Pennsylvania’s Moore School of Engineering in Philadelphia, this one to be used
for general-purpose engineering calculation. The Moore School’s 1930s
collaboration with the BRL, each building a differential analyzer under Bush’s
supervision, was to prove extremely important. During World War II, the two
institutions would collaborate again to build the ENIAC, America’s first full-scale
electronic digital computer.

Bush was perhaps the single most important figure in American science
during World War II, not because of his considerable scientific contributions but
because of his administrative leadership. As war approached, Bush and some of
his distinguished colleagues had used their influence to start organizing the
scientific community for the coming effort. After convincing President

6 The transfer of the name “computer” to the machine was by no means immediate. The Reader’s
Guide to Periodical Literature does not list “computer” as a heading until 1957. Most news articles
of the 1945–1955 period place the word, if they use it at all, in scare quotes. See Paul E. Ceruzzi,
“When Computers Were Human,” Annals of the History of Computing, Vol. 13, No. 3 (1991),
237–244; Henry S. Tropp et al., “A Perspective on SAGE: Discussion,” Annals of the History of
Computing, Vol. 5, No. 4 (1983), 375–398; Jean J. Bartik and Frances E. Holberton, interviewed by
Richard R. Mertz, 5/28/70, Smithsonian Computer Oral History, AC NMAH #196 (Archive Center,
National Museum of American History, Washington, D.C.).
Roosevelt that close ties between the government and scientists would be critical
to this war, they established the National Defense Research Committee (NDRC)
in 1940, with Bush serving as chair. When the agency’s mandate to conduct
research but not development on weapons systems proved too restrictive, Bush
created and took direction of an even larger organization, the development-
oriented Office of Scientific Research and Development (OSRD), which
subsumed the NDRC.7 The OSRD coordinated and supervised many of the huge
science and engineering efforts mobilized for World War II. By 1945 its annual
spending exceeded $100 million; the prewar total for military R&D had been
about $23 million.8

Academic and industrial collaboration with the military under the OSRD
was critically important in World War II. Research on radio, radar, the atomic
bomb, submarines, aircraft, and computers all moved swiftly under its
leadership. Bush’s original plans called for a decentralized research system in
which academic and industrial scientists would remain in their home
laboratories and collaborate at a distance. As the research effort expanded,
however, this approach became increasingly unwieldy, and the OSRD moved
toward a system of large central laboratories.

Contracts with universities varied, but under most of them the university
provided laboratory space, management, and some of the scientific personnel for
large, multidisciplinary efforts. The Radio Research Laboratory at Harvard
employed six hundred people, more of them from California institutions than
from Harvard itself. MIT’s Radiation Laboratory, the largest of the university
research programs, ultimately employed about four thousand people from sixty-
nine different academic institutions.9 Academic scientists went to work for
industrial and military research groups, industrial scientists assisted universities,
and the military’s weapons and logistics experts and liaison officers were
frequent visitors to every laboratory. The war effort thus brought about the most
radical disciplinary mixing, administrative centralization, and social
reorganization of science and engineering ever attempted in the United States.

It would be almost impossible to overstate the long-term effects of this
enormous undertaking on American science and engineering. The vast
interdisciplinary effort profoundly restructured scientific research communities.
It solidified the trend to science-based industry -- already entrenched in the

7 On the NDRC and OSRD, see Daniel Kevles, The Physicists (New York: Alfred A. Knopf, 1971);
David Dickson, The New Politics of Science (New York: Pantheon, 1984); and Bruce L. R. Smith,
American Science Policy since World War II (Washington DC: Brookings Institution, 1991).
8 Paul Forman, “Behind Quantum Electronics: National Security as Basis for Physical Research in
the United States, 1940–1960,” Historical Studies in the Physical and Biological Sciences, Vol. 18,
No. 1 (1987), 152.
9 James Phinney Baxter, Scientists Against Time (Boston: Little, Brown, 1946), 21. On the Radiation
Laboratory see also Karl L. Wildes and Nilo A. Lindgren, A Century of Electrical Engineering and
Computer Science at MIT, 1882–1982 (Cambridge, MA: MIT Press, 1985).
interwar years -- but it added the new ingredient of massive government
funding and military direction. MIT, for example, “emerged from the war with a
staff twice as large as it had had before the war, a budget (in current dollars) four
times as large, and a research budget ten times as large -- 85 percent from the
military services and their nuclear weaponeer, the AEC.”10 Eisenhower
famously named this new form the “military-industrial complex,” but the nexus
of institutions is better captured by the concept of the “iron triangle” of self-
perpetuating academic, industrial, and military collaboration.11

Almost as important as the institutional restructuring was the creation of
an unprecedented experience of community among scientists and engineers.
Boundaries between scientific and engineering disciplines were routinely
transgressed in the wartime labs, and scientists found the chance to apply their
abilities to create useful devices profoundly exciting. For example, their work on
the Manhattan Project bound the atomic physicists together in an intellectual
and social brotherhood whose influence continued to be felt into the 1980s.
Radiation Laboratory veterans protested vigorously when the lab was to be
abruptly shut down in December 1945 as part of postwar demobilization; they
could not believe the government would discontinue support for such a patently
valuable source of scientific ideas and technical innovations. Their outcry soon
provoked MIT, supported by the Office of Naval Research (ONR), to locate a
successor to the Rad Lab in its existing Research Laboratory of Electronics.12
Connections formed during the war became the basis, as we will see over and
over again, for enduring relationships between individuals, institutions, and
intellectual areas.

Despite his vast administrative responsibilities, Bush continued to work
on computers early in the war. He had, in fact, begun thinking in 1937–38 about a
possible electronic calculator based on vacuum tubes, a device he called the
Rapid Arithmetical Machine. Memoranda were written and a research assistant
was engaged. But Bush dropped the project as war brought more urgent needs.
His assistant, Wilcox Overbeck, continued design work on the machine, but he
too was finally forced to give up the project when he was drafted in 1942. Most of
Overbeck’s work focused on tube design, since Bush was concerned that the high
failure rates of existing vacuum tubes would render the Rapid Arithmetical
Machine too unreliable for practical use. Possibly because of this experience, Bush

10 Forman, “Behind Quantum Electronics,” 156–157.


11 See David Noble, America By Design: Science, Technology, and the Rise of Corporate
Capitalism (Oxford: Oxford University Press, 1977), as well as Dickson, New Politics of Science,
and Merritt Roe Smith, ed., Military Enterprise and Technological Change (Cambridge, MA: MIT
Press, 1985). The phrase “iron triangle” is from Gordon Adams, The Politics of Defense Contracting:
The Iron Triangle (New Brunswick, NJ: Transaction Books, 1982).
12 Wildes and Lindgren, A Century of Electrical Engineering and Computer Science at MIT, 289 and
passim.
opposed fully electronic computer designs until well after the end of World War
II.13

Bush did, however, perfect a more powerful version of the differential
analyzer, known as the Rockefeller Differential Analyzer (after its funding
source) at MIT in 1942. This device could be programmed with punched paper
tape and had some electronic components. Though committed to analog
equipment and skeptical of electronics, he kept abreast of the Moore School’s
ENIAC project, and the universe of new possibilities opened up by computers
intrigued him.14

Thus it so happened that the figure most central to World War II science
was also the inventor of the prewar period’s most important computer
technology. Bush’s laboratory at MIT had established a tradition of analog
computation and control engineering -- not, at the time, separate disciplines -- at
the nation’s most prestigious engineering school. This tradition, as we will see,
weighed against the postwar push to build digital machines. Simultaneously,
though, the national science policies Bush helped create had the opposite effect.
The virtually unlimited funding and interdisciplinary opportunities they
provided encouraged new ideas and new collaborations, even large and
expensive ones whose success was far from certain. Such a project was the Moore
School’s Electronic Numerical Integrator and Calculator (ENIAC), the first
American electronic digital computer.

The ENIAC Project

Even with the help of Bush’s differential analyzer, compiling ballistics tables for
antiaircraft weapons and artillery involved tedious calculation. Tables had to be
produced for every possible combination of gun, shell, and fuse; similar tables
were needed for the (analog) computing bombsight and for artillery pieces. Even
with mechanical aids, human “computers” made frequent mistakes,
necessitating time-consuming error-checking routines. The BRL eventually
commandeered the Moore School’s differential analyzer as well. Still, with two
of these machines, the laboratory fell further and further behind in its work.

13 Ibid., 229. J.C.R. Licklider speculated that Bush’s dislike of digital machines stemmed from an
argument with Norbert Wiener over whether binary arithmetic was the best base to use. See John
A. N. Lee and Robert Rosin, “The Project MAC Interviews,” IEEE Annals of the History of
Computing, Vol. 14, No. 2 (1992), 22.
14 See, for example, Bush’s prescient article on the “memex,” a kind of hypertext technology, in “As
We May Think,” Atlantic Monthly 176, 1945. Reprinted in Zenon Pylyshyn, ed., Perspectives on the
Computer Revolution (Englewood Cliffs: Prentice-Hall, 1970), 47–59.
“The automation of this process was . . . the raison d’être for the first
electronic digital computer,” wrote Herman Goldstine, co-director of the ENIAC
project. The best analog computers, even those built during the war, were only
“about 50 times faster than a human with a desk machine. None of these [analog
devices were] sufficient for Aberdeen’s needs since a typical firing table required
perhaps 2,000–4,000 trajectories. . . . The differential analyzer required perhaps
750 hours -- 30 days -- to do the trajectory calculations for a table.”15 (To be
precise, however, these speed limitations were due not to the differential
analyzer’s analog characteristics, but to its electro-mechanical nature. Electronic
equipment, performing many functions at the speed of light, could be expected to
provide vast improvements. As Bush’s RDA had demonstrated, electronic
components could be used for analog as well as digital calculation. Thus nothing
in Aberdeen’s situation dictated a digital solution to the computation bottleneck.)

The Moore School started research on new ways of automating the
ballistics calculations, under direct supervision of the BRL and the Office of the
Chief of Ordnance. In 1943 Moore School engineers John Mauchly and J. Presper
Eckert proposed the ENIAC project. They based its digital design in part on
circuitry developed in the late 1930s by John Atanasoff and Clifford Berry of the
Iowa State College. (Atanasoff and Berry, however, never pursued their designs
beyond a small-scale prototype calculator, conceiving it, as did most engineers of
their day, more as a curiosity of long-term potential than as an immediate
alternative to existing calculator technology.)16 The BRL, at this point desperate
for new assistance, approved the project over the objections of Bush, who
thought the electronic digital design infeasible.

The ENIAC represented an electrical engineering project of a completely
unprecedented scale. The machine was about 100 times larger than any other
existing electronic device, yet to be useful it would need to be at least as reliable as
far smaller machines. Calculations revealed that because of its complexity, the
ENIAC would have to operate with only one chance in 10^14 of a circuit failure in
order to function continuously for just twelve hours. Based on these estimates,
some of ENIAC’s designers predicted that it would operate only about 50 percent
of the time, presenting a colossal maintenance problem, not to mention a
challenge to operational effectiveness.17
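
The arithmetic behind such reliability estimates can be sketched roughly. The back-of-the-envelope calculation below is my own reconstruction, not Goldstine's actual derivation: it assumes the 18,000 tubes described in the next paragraph and a pulse rate of roughly 100,000 per second (a figure commonly cited for the ENIAC but not given here), and shows why a per-operation failure chance near one in 10^14 is needed for a reasonable shot at a failure-free twelve-hour run.

# Rough reconstruction in Python; the tube count and pulse rate are assumptions.
tubes = 18_000
pulses_per_second = 100_000
seconds = 12 * 3600

opportunities = tubes * pulses_per_second * seconds    # ~7.8e13 tube-pulses

p_failure = 1e-14                                      # "one chance in 10^14"
p_clean_run = (1 - p_failure) ** opportunities         # roughly exp(-p * opportunities)

print(f"{opportunities:.1e} opportunities for a failure")
print(f"chance of a failure-free 12-hour run: {p_clean_run:.2f}")   # about 0.46

With a per-operation failure probability much worse than 10^-14, the chance of an uninterrupted run collapses toward zero, which is why the designers expected the machine to be out of service so much of the time.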

15 Herman Goldstine, The Computer from Pascal to von Neumann (Princeton, NJ: Princeton
University Press, 1972), 135–136. As noted in chapter 1, the British Colossus, not the ENIAC, was
actually the first electronic digital computer. Because of British secrecy, Goldstine may not have
known this when he wrote the cited lines in 1972.
16 Williams, History of Computing Technology; Stan Augarten, Bit by Bit: An Illustrated History
of Computers (New York: Ticknor & Fields, 1984), 119–120.
17 Williams, History of Computing Technology, 275. The estimate of part failure probability is due
to Herman Goldstine.
When completed in 1945, the ENIAC filled a large room at the Moore
School with equipment containing 18,000 vacuum tubes, 1500 relays, 70,000
resistors, and 10,000 capacitors. The machine consumed 140 kilowatts of power
and required internal forced-air cooling systems to keep from catching fire. The
gloomy forecasts of tube failure turned out to be correct, in one sense: when the
machine was turned on and off on a daily basis, a number of tubes would burn
out almost every day, leaving it nonfunctional about 50 percent of the time, as
predicted. Most failures, however, occurred during the warm-up and cool-down
periods. By the simple (if expensive) expedient of never turning the machine off,
the engineers dropped the ENIAC’s tube failures to the more acceptable rate of
one tube every two days.18

The great mathematician John von Neumann became involved with the
ENIAC project in 1944, after a chance encounter with Herman Goldstine on a
train platform. By the end of the war, with Eckert, Mauchly, and others, von
Neumann had planned an improved computer, the EDVAC. The EDVAC was
the first machine to incorporate an internal stored program, making it the first
true computer in the modern sense.19 (The ENIAC was programmed externally,
using switches and plugboards.) The plan for the EDVAC’s logical design served
as a model for nearly all future computer control structures -- often called “von
Neumann architectures” -- until the 1980s.20
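
Footnotes 19 and 20 define the modern sense of “computer” in terms of internally stored programs, conditional branching, and serial processing of an instruction stream. The toy machine below (my own illustration in Python; the instruction names and memory layout are invented and bear no relation to the EDVAC's actual design) shows all three features in miniature: instructions and data share a single memory, and one loop fetches and executes a single instruction at a time, branching when a condition is met.

# A toy stored-program machine, for illustration only.
def run(memory):
    pc, acc = 0, 0                       # program counter and accumulator
    while True:
        op, arg = memory[pc]             # fetch the next instruction from memory
        pc += 1                          # serial, one-by-one execution
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMP_IF_NEG":
            if acc < 0:
                pc = arg                 # conditional branch
        elif op == "HALT":
            return memory

# Instructions (cells 0-4) and data (cells 5-7) occupy the same memory.
memory = [("LOAD", 5), ("ADD", 6), ("STORE", 7), ("JUMP_IF_NEG", 0), ("HALT", 0),
          2, 3, 0]
print(run(memory)[7])                    # prints 5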

Initially budgeted at $150,000, the ENIAC finally cost nearly half a
million dollars. Without the vast research funding and the atmosphere of
desperation associated with the war, it probably would have been years, perhaps
decades, before private industry attempted such a project. The ENIAC became,
like radar and the bomb, an icon of the miracle of government-supported “big
science.”

18 Ibid., 285.
19 In contemporary parlance the word “computer” refers to electronic digital machines with a
memory and one or more central processing units. In addition, to qualify as a computer a device must
be capable of (a) executing conditional branching (i.e., carrying out different sets of instructions
depending upon the results of its own prior calculations) and (b) storing these instructions
(programs) internally. Babbage’s Analytical Engine had most of these features but would have
stored its programs externally on punched wooden cards. Somewhat like the Analytical Engine, the
EDVAC was only partially completed; thus it represents the first true computer design but not the
first actually operating computer. Instead, the British Manchester University Mark I achieved
that honor in 1948. See Williams, History of Computing Technology, 325; Augarten, Bit by Bit, 149.
20 Goldstine distributed this plan, under von Neumann’s name but unbeknownst to him, as the
famous and widely read “Draft Report on the EDVAC.” Because von Neumann’s name was on its
cover, the misunderstanding arose that he was the report’s sole author. But many of the Draft
Report’s key ideas actually originated with Eckert, Mauchly, and other members of the ENIAC
design team. Because of this misunderstanding, which later escalated into a lawsuit, and the fame
he acquired for other reasons, von Neumann has received more credit for originating computer
design than he probably deserved. See Augarten, Bit by Bit, 136ff. The most essential feature of the
so-called von Neumann architecture is serial (one-by-one) processing of the instruction stream.
The ENIAC was not completed until the fall of 1945, after the war had
ended. The ballistics tables ENIAC was built to compute no longer required
urgent attention. But the ENIAC was a military machine, and so it was
immediately turned to the military ends of the rapidly emerging Cold War. The
first problem programmed on the machine was a mathematical model of a
hydrogen bomb from the Los Alamos atomic weapons laboratories. The ENIAC,
unable to store programs or retain more than twenty ten-digit numbers in its
tiny memory, required several weeks in November 1945 to run the program in a
series of stages. The program involved thousands of steps, each individually
entered into the machine via its plugboards and switches, while the data for the
problem occupied one million punch cards. The program’s results exposed
several problems in the proposed H-bomb design. The director of Los Alamos
expressed his thanks to the Moore School in March 1946, writing that “the
complexity of these problems is so great that it would have been impossible to
arrive at any solution without the aid of ENIAC.”21

This event was symbolic of a major and portentous change. The wartime
alliance of academic and industrial science with the military had begun as a
temporary association for a limited purpose: winning a war against aggressors.
Now it was crystallizing into a permanent union.

At the formal dedication ceremony on February 15, 1946, just before
pressing a button that set the ENIAC to work on a new set of hydrogen bomb
equations, Major General Gladeon Barnes spoke of “man’s endless search for
scientific truth.” In turning on the ENIAC, he said he was “formally dedicating
the machine to a career of scientific usefulness.”22 Barnes, like many others in
the aftermath of World War II, failed to find irony in the situation: that the
“scientific truth” the ENIAC began to calculate was the basis for ultimate
weapons of destruction.

Directing Research in the Postwar Era

As the postwar Truman administration began to tighten the strings of the
virtually unlimited wartime purse, expectations in many quarters were that, like

21 Augarten, Bit by Bit, 130–131, citing Nancy Stern, From ENIAC to UNIVAC (Boston: Digital
Press, 1981). Whether the ENIAC was actually able to solve this first problem is a matter of
debate among historians. Kenneth Flamm, citing an interview with Stanley Frankel (Smithsonian
Institution Computer Oral History Project, October 5, 1972, conducted by Robina Mapstone), holds
that it failed and that the calculations were actually carried out later at Eckert and Mauchly’s
UNIVAC factory in Philadelphia. See Kenneth Flamm, Targeting the Computer: Government
Support and International Competition (Washington, DC: Brookings Institution, 1987), 79.
22 Barnes, cited in Goldstine, The Computer from Pascal to von Neumann, 229.
the armed forces, the enormous scientific and engineering network assembled
for the war effort would be demobilized and thrown on its own resources.

For a number of reasons, the looming fiscal constraints never
materialized. Postwar federal expenditures for R&D remained far higher than
before the war, with most of the money channeled through the armed forces.

In [fiscal year] 1938 the total U.S. budget for military research and
development was $23 million and represented only 30 percent of
all Federal R&D; in fiscal 1945 the OSRD alone spent more than
$100 million, the Army and Navy together more than $700
million, and the Manhattan Project more than $800 million. . . .
In the immediate postwar years total military expenditure
slumped to a mere seven times its prewar constant-dollar level,
while constant-dollar military R&D expenditure held at a full 30
times its prewar level, and comprised about 90 percent of all
federal R&D. In the early 1950s total military expenditure soared
again, reaching 20 times its prewar constant-dollar level, while
military R&D reattained, and before the end of the decade much
surpassed, its World War II high.23

Industrial R&D expenditures soared as well. By the late 1940s the total
amount of industrial R&D roughly equaled that sponsored by the federal
government, but as figure 2.1 shows, a decade later government R&D spending
was nearly double that of industry.

This trend, and the politics it reflected, resulted from three concurrent
developments in postwar American politics. First, in the rapid transition from
World War II to the Cold War, the war’s key events served as anchoring icons
for postwar policies. Wartime institutions became blueprints for their postwar
counterparts. Second, the emerging politico-military paradox of a peacetime Cold
War generated a perceived need for new technology, justifying vast military
investments in research. Finally, fierce public debates about postwar federal
support for science and technology had ended in stalemate. Plans for a National
Science Foundation suffered long delays, and military agencies were left to fill
the resulting vacuum. Let us explore each of these developments in turn.

Transference and Apocalypse

23 Forman, “Behind Quantum Electronics,” 152.


World War II was “the good war,” a war not only against greedy, power-hungry
aggressors but against an inhumane, antidemocratic ideology. This nearly
universal sentiment was vindicated and vastly amplified by postwar revelations
of the horrors of the Nazi concentration camps.

Soviet maneuverings in Eastern Europe, as well as the openly
expansionist Soviet ideology, provided grounds for the transition into a Cold
War. Stalin was rapidly equated with Hitler. The closed nature of Soviet society
added a sinister force to mounting rumors of purges and gulag atrocities. In the
eyes of many Americans, communism replaced fascism as an absolute enemy. It
was seen (and saw itself) not just as one human order among others, but as an
ultimate alternative system, implacably opposed to Western societies in virtually
every arena: military, political, ideological, religious, cultural, economic. The
absoluteness of the opposition allowed the sense of an epic, quasi-Biblical
struggle that surrounded the fight against Nazism and fascism not only to
survive but to thrive.

This transference of attitudes from World War II to the Cold War
included a sense of a global, all-encompassing, apocalyptic conflict. In the 1940s
and 1950s the partitioning of Europe, revolutionary upheavals across the
postcolonial world, and the contest for political and ideological alliances
throughout Europe, Asia, and the Middle East encouraged American perceptions
that the world’s future balanced on a knife edge between the United States and
the USSR. Truman’s declaration of worldwide American military support for
“free peoples who are resisting attempted subjugation by armed minorities or by
outside pressures” codified the continuation of global conflict on a permanent
basis and elevated it to the level of a universal struggle between good and evil,
light and darkness, freedom and slavery.24

The only combatant nation to emerge from the war intact, the United
States had simultaneously left behind the economic depression and the political
isolationism of the 1930s. The magnitude of this change cannot be
overemphasized, since we have become used to a very different world.25 The
United States was a world industrial power before the Great Depression, but with
a few brief exceptions had played only a minor role in world political and
military affairs during a period when the European colonial empires still ruled
the globe. As late as 1939, the U.S. army numbered only 185,000 men, with an
annual budget under $500 million. America maintained no military alliances

24 Harry S Truman, cited in Ambrose, Rise to Globalism, 86.


25 By the Cold War’s end in 1989, peacetime military budgets routinely reached levels in excess of
$300 billion. America had 1.5 million men and women in uniform, defense alliances with 50 nations,
and military bases in 117 countries.
with any foreign country.26 Six years later, at the end of World War II, the
United States had over 12 million men under arms and a military budget
swollen to 100 times its prewar size. In addition,

the U.S. was producing 45 percent of the world’s arms and nearly
50 percent of the world’s goods. Two-thirds of all the ships afloat
were American built. . . . The conclusion of the war . . . found the
U.S. either occupying, controlling, or exerting strong influence
in four of the five major industrial areas of the world -- Western
Europe, Great Britain, Japan, and the U.S. itself. Only the Soviet
Union operated outside the American orbit. . . . The U.S. was the
only nation in the world with capital resources available to solve
the problems of postwar reconstruction.27

The old colonial empires were bankrupt and on the verge of
disintegration, the imperial pretensions of Japan had been smashed, and the
Soviet Union, though still powerful, had suffered staggering losses. Postwar
public sentiment for a return to the isolationism of the 1930s was strong, as was
fear of renewed economic depression. The Truman administration initially
tended to honor these worries with its heavy focus on a balanced budget and a
rapid military demobilization. But the transference of World War II’s apocalyptic
struggles into the postwar world, the sense of America’s awesome power, the fear
of future nuclear war, and the need to re-establish war-torn nations as markets
for American goods -- to stave off the feared depression -- combined to render
isolationism untenable. The postwar geopolitical situation thus catapulted the
United States into a sudden and unaccustomed role as world leader.

America’s leaders in the postwar world had been weaned on the
isolationist worldview. Except for a brief period after World War I, the United
States had never before played a controlling role in world affairs. Thus the war
itself provided the only immediately available models for action. Key events of
World War II became basic icons in the organization of American foreign policy
and military strategy:

• The 1938 Munich accords, in which Great Britain and France
handed over parts of Czechoslovakia to Hitler in a futile
attempt to stave off war, symbolized the danger of
appeasement.

26 Ambrose, Rise to Globalism, xiii.


27 Ibid., 30, 53.
• The Maginot Line -- a chain of massive fortresses along the
French-German border that the Germans had avoided by the
simple maneuver of invading through Belgium --
represented the foolhardiness of a defensive grand strategy.

• Pearl Harbor, where unopposed Japanese aircraft destroyed or
disabled a significant portion of the U.S. Pacific fleet, signified
the perpetual danger of surprise attack.

• Radar (and MIT’s Radiation Laboratory, which led wartime
radar research) and the Manhattan Project came to represent
the power of organized science to overcome military odds
with ingenuity.

• The atomic bomb itself, credited with the rapid end to the war
in the Pacific, became the enigmatic symbol of both invincible
power and global holocaust.

The unfolding political crises of the Cold War were invariably interpreted
in these terms.28 For example, the Berlin blockade was perceived as another
potential Munich, calling for a hard-line response rather than a negotiation.
Truman interpreted Korea through the lens of World War II: “Communism was
acting in Korea just as Hitler, Mussolini and the Japanese had acted. . . . I felt
certain that if South Korea was allowed to fall, Communist leaders would be
emboldened to override nations closer to our own shores.”29 Critics of the 1950s
characterized continental air defense as a Maginot Line strategy for starry-eyed
technological optimists. U.S. forward basing of nuclear weapons, in positions
vulnerable to surprise air attack, was likened to the risk of another Pearl
Harbor.30 The Manhattan Project was invoked endlessly to rally support for
major R&D projects such as the space program. Finally, the growing nuclear
arsenal was a reminder of Hiroshima, both horror and symbol of ultimate
power, and it was simply assumed (for a while) that no nation would be willing
to stand up to a weapon of such destructive force.31

28 Rystad has named this interpretation the “Munich paradigm.” See Göran Rystad, Prisoners of
the Past? (Lund, Sweden: CWK Gleerup, 1982), 33 and passim.
29 Harry S Truman, Memoirs: Years of Trial and Hope, Vol. 2 (Garden City, NY: Doubleday, 1956),
332–333.
30 Herbert York, Los Alamos physicist and Eisenhower advisor, noted that “because of Pearl
Harbor you didn’t have to discuss the notion of surprise attack. It was in your bones that the
Russians are perfidious, that surprise attack is the way wars always start, and that appeasement
doesn’t work.” Cited in Gregg Herken, Counsels of War (New York: Knopf, 1983), 125.
31 Atomic bombs, of course, were only one in a long line of weapons Americans (and others) believed
would make war too terrible to fight. See Franklin, War Stars.
Thus in many respects the Cold War was not a new conflict with
communism but the continuation of World War II, the transference of that
mythic, apocalyptic struggle onto a different enemy.32

American Antimilitarism and a High-Technology Strategy

The authors of the U.S. Constitution feared professional armies as dangerous
concentrations of unaccountable state power. They saw the career officer corps,
which in European armies maintained the hereditary linkage between royalty,
gentry, and control of the armed forces, as a linchpin of aristocracy. In addition,
military social structure, with its strict hierarchy and its authoritarian ethic,
seemed the antithesis of a participatory democracy.33

But having won their independence in a revolutionary war, the founders
naturally also understood the importance of military power in international
politics. The constitutional provision for a citizen army linked the right to
participate in government with the responsibility to protect it by force of arms.
Every citizen was a potential soldier, but every soldier was also a citizen; thus, in
principle, loyalty to military institutions was subordinated to loyalty to state and
civil society. In practice, until World War II, this also meant that armed forces
were mustered only for war and were greatly reduced once war ended. American
political discourse still reflects this ambivalence toward military forces and the
conflictual relationship between democratic ideals and military principles of
authority.34

American antimilitarism, then, is not at all the same thing as pacifism, or
principled objection to armed force itself. Instead, antimilitarism is an instance of
what the political scientist Samuel Huntington calls the “anti-power ethic” in
American society, the enormous value this society has always placed on political
limits to power, hierarchy, and authority.35

In the postwar years a number of factors contributed to a changed
perception of the need for a powerful armed force in peacetime. The absolute

32 Baritz, Backfire, makes a compelling case for a similar argument.


33 On military social and political relations in the pre-revolutionary era, see Maury D. Feld, The
Structure of Violence: Armed Forces as Social Systems (Beverly Hills: Sage, 1977). The tension
between the authoritarian military ethic and democratic principles is the focus of a number of
novels and films, such as Charles Nordhoff’s Mutiny on the Bounty (Boston: Little, Brown, 1932)
and Rob Reiner’s 1992 film A Few Good Men, based on Aaron Sorkin’s play of the same title.
34 I owe this point to Robert Meister.
35 See Samuel Huntington, American Politics: The Promise of Disharmony (Cambridge: Belknap
Press, 1981).
Allied victory supported a vast new confidence in the ability of military force to
solve political problems. The occupation of Germany and Japan meant an
ongoing American military presence on other continents. The relative
insignificance of American suffering in the war produced an inflated sense of the
ease of military victory -- the idea that the United States, at least, could buy a lot
for a little with military power and new technology. Also, with America’s full-
blown emergence as a world economic power came new interests across the
globe, interests that could conceivably require military defense. Finally, the rapid
transition from World War II into the Cold War left little time for a
retrenchment into prewar values: the apocalyptic conflict simply continued.

Furthermore, technological factors such as the bomb and the maturation
of air warfare now made it possible to conceive of a major military role for the
United States outside its traditional North American sphere of influence.
Historically, ocean barriers had separated the United States from the other
nations possessing the technological wherewithal to mount a serious military
challenge. These were now breached. Airplanes and, later, guided missiles could
pose threats at intercontinental range. In effect, the very concept of national
borders was altered by these military technologies: the northern boundary of the
United States, in terms of its defense perimeter, now lay at the limits of radar
vision, which in the 1950s rapidly moved northward to the Arctic Circle.

Antimilitarism, because it required that the number of men under arms
be minimized, also helped to focus strategic planning on technological
alternatives. The Strategic Air Command came to dominate U.S. strategic
planning because it controlled the technological means for intercontinental
nuclear war. It was the primary threat America could wield against the Soviet
Union, yet it required mainly money and equipment, not large numbers of
troops. The Army’s massive manpower seemed less impressive, less necessary,
and more of a political liability in the face of the minimally manned or even
automated weapons of the Air Force. As the Soviet Union acquired long-range
bombers, nuclear weapons, and then ICBMs, the role of the Air Force and its
technology in both defense and offense continued to expand.

The Cold War marked the first time in its history that America
maintained a large standing army in peacetime. But its geographical situation of
enormous distance from its enemies, combined with its antimilitarist ethic,
ensured that the institutional form taken by a more vigorous American military
presence would differ from the more traditional European and Soviet
approaches of large numbers of men under arms. Instead of universal
conscription, the United States chose the technological path of massive, ongoing
automation and integration of humans with machines. First Truman and then
Eisenhower, each balancing the contradictory goals of an expanding, activist
global role and a contracting military budget, relied ever more heavily on
nuclear weapons. By the end of the 1950s high technology -- smaller bombs with
higher yields, tactical atomic warheads for battlefield use, bombers of increasingly
long range, high-altitude spy planes, nuclear early warning systems, and rockets
to launch spy satellites and ICBMs -- had become the very core of American
global power.

Support for Research and Development

In his famous 1945 tract Science: The Endless Frontier, composed at President
Roosevelt’s request as a blueprint for postwar science and technology policy,
Vannevar Bush called for a civilian-controlled National Research Foundation to
preserve the government-industry-university relationship created during the
war. In his plea for continuing government support, Bush cited the Secretaries of
War and Navy to the effect that scientific progress had become not merely
helpful but utterly essential to military security for the United States in the
modern world:

This war emphasizes three facts of supreme importance to
national security: (1) Powerful new tactics of defense and offense
are developed around new weapons created by scientific and
engineering research; (2) the competitive time element in
developing those weapons and tactics may be decisive; (3) war is
increasingly total war, in which the armed services must be
supplemented by active participation of every element of the
civilian population.

To insure continued preparedness along farsighted
technical lines, the research scientists of the country must be
called upon to continue in peacetime some substantial portion
of those types of contribution to national security which they
have made so effectively during the stress of the present war.36

Bush’s MIT colleague Edward L. Bowles, Radiation Laboratory
“ambassador” to government and the military, advocated an even tighter
connection. Bowles wrote of the need to “systematically and deliberately couple”

36 Secretaries of War and Navy, joint letter to the National Academy of Sciences, cited in
Vannevar Bush, Science: The Endless Frontier (Washington: U.S. Government Printing Office,
1945), 12.
scientific and engineering schools and industrial organizations with the military
forces “so as to form a continuing, working partnership.”37

Bush also argued that modern medicine and industry were increasingly
increasingly dependent on vigorous research efforts in basic science. The massive
funding requirements of such research could not be met by the cash-poor
academic community, while the industrial sector’s narrow and short-term goals
would discourage it from making the necessary investment. Consequently the
new foundation Bush proposed would have three divisions, one for natural
sciences, one for medical research, and one for national defense.

Bush’s efforts were rebuffed, at first, by Truman’s veto of the bill
establishing the NSF. The populist president blasted the bill, which in his view
“would . . . vest the determination of vital national policies, the expenditure of
large public funds, and the administration of important government functions
in a group of individuals who would be essentially private citizens. The
proposed National Science Foundation would be divorced from . . . control by
the people.”38

With major research programs created during the war in jeopardy, the
War Department moved into the breach, creating the Office of Naval Research in
1946. In a pattern repeated again and again during the Cold War, national
security provided the consensual justification for federally funded research. The
ONR, conceived as a temporary stopgap until the government created the NSF,
became the major federal force in science in the immediate postwar years and
remained important throughout the 1950s. Its mandate was extremely broad: to
fund basic research (“free rather than directed research”), primarily of an
unclassified nature.39

Yet the ONR’s funding was rarely, if ever, a purely altruistic activity.40 The
bill creating the office mentions the “paramount importance [of scientific
research] as related to the maintenance of future naval power, and the
preservation of national security”; the ONR’s Planning Division sought to
maintain “listening posts” and contacts with cutting-edge scientific laboratories

37 Bowles, cited in Wildes and Lindgren, A Century of Electrical Engineering and Computer Science
at MIT, 203.
38 Harry S Truman, cited in James L. Penick Jr. et al., The Politics of American Science: 1939 to the
Present, revised ed. (Cambridge, MA: MIT Press, 1972), 20.
39 The Vinson Bill creating the ONR, cited in ibid., 22.
40 For a somewhat different view of the ONR, see Harvey Sapolsky, Science and the Navy
(Princeton, NJ: Princeton University Press, 1990). Among the best, and certainly to date the most
detailed, discussions of the mixed motives of military support for postwar academic research is
Stuart W. Leslie, The Cold War and American Science: The Military-Industrial-Academic
Complex at MIT and Stanford (New York: Columbia University Press, 1993).
for the Navy’s possible use.41 Lawmakers were well aware that the ONR
represented a giant step down the road to a permanent federal presence in
science and engineering research and a precedent for military influence. House
Committee on Naval Affairs Chairman Carl Vinson opposed the continuing
executive “use of war powers in peacetime,” forcing the Navy to go to Congress
for authorization.42

By 1948 the ONR was funding 40 percent of all basic research in the United
States; by 1950 the agency had let more than 1,200 separate research contracts
involving some 200 universities. About half of all doctoral students in the
physical sciences received ONR support.43 ONR money proved especially
significant for the burgeoning field of computer design. It funded a number of
major digital computer projects, such as MIT’s Whirlwind, Raytheon’s
Hurricane, and Harvard’s Mark III.44 The NSF, finally chartered in 1950 after
protracted negotiations, did not become a significant funding source for
computer science until the 1960s (in part because computer science did not
become an organized academic discipline until then). Even after 1967, the only
period for which reliable statistics are available, the NSF’s share of total federal
funding for computer science hovered consistently around the 20 percent mark,
while DoD obligations ranged between 50 and 70 percent, or 60 to 80 percent if
military-related agencies such as the Department of Energy (responsible for
atomic weapons research) and NASA (whose rockets lifted military surveillance
satellites and whose research contributed to ballistic missile development) are
included.45

The Military Role in Postwar Computer Research

41 The Vinson Bill and the ONR Planning Division, cited in Penick et al., The Politics of American
Science, 22–23.
42 Vinson, quoted in Kent C. Redmond and Thomas M. Smith, Project Whirlwind (Boston: Digital
Press, 1980), 105.
43 Dickson, New Politics of Science, 118–119.
44 See Mina Rees, "The Computing Program of the Office of Naval Research, 1946–1953," Annals of
the History of Computing, Vol. 4, No. 2 (1982), 103–113.
45 The figure of 20 percent for NSF support is generous, since the budget category used includes both
mathematics and computer science research. On the DOE and NASA as auxiliary military
agencies, see Flamm, Targeting the Computer, 46 and passim. My discussion in this chapter relies
heavily on Flamm’s published account; on some points I am indebted to him for personal
communications as well. On NASA’s role as a civilian cover for military research and the U.S.
geostrategic aim of establishing international rights of satellite overflight (the “open skies”
policy) in order to obtain intelligence about Soviet military activities, see Walter A. McDougall,
...the Heavens and the Earth: A Political History of the Space Age (New York: Basic Books, 1985).
With the war’s end, some corporate funding became available for computer
research. A few of the wartime computer pioneers, such as ENIAC engineers
Mauchly and Eckert, raised commercial banners. The company they formed
developed the BINAC, the first American stored-program electronic computer,
and then the UNIVAC, the first American commercial computer.46

But military agencies continued, in one way or another, to provide the
majority of support. The Army (via the Census Bureau) and Air Force (via the
Northrop Corporation’s Snark missile project) were Eckert and Mauchly’s major
supporters. Bell Laboratories, the largest independent electronics research
laboratory in the country, saw the percentage of its peacetime budget allocated to
military projects swell from zero (prewar) to upwards of 10 percent as it
continued work on the Nike missile and other systems, many of them involving
analog computers.47 Many university-based computer researchers continued
under ONR sponsorship. Others became involved in a private company,
Engineering Research Associates (ERA), which developed cryptological
computers for its major customer, the Navy, as well as later commercial
machines based on its classified work. (When ERA’s ATLAS became operational,
in 1950, it was the second electronic stored-program computer in the United
States.)48

With military and Atomic Energy Commission support, John von
Neumann began his own computer project at the Institute for Advanced Study
(IAS). The so-called IAS machine, completed in 1952, became one of the most
influential computers of the immediate postwar period. Several copies were
built at defense research installations, including the Rand Corporation and the
Los Alamos, Oak Ridge, and Argonne National Laboratories.49

How much military money went to postwar computer development?
Because budgets did not yet contain categories for computing, an exact accounting
is nearly impossible. Kenneth Flamm has nevertheless managed to calculate
rough comparative figures for the scale of corporate and military support.50
Flamm estimates that in 1950 the federal government provided between $15 and
$20 million (current) per year, while industry contributed less than $5 million --

47 M. D. Fagen, ed., A History of Engineering and Science in the Bell System (Murray Hill, NJ: Bell
Telephone Laboratories, 1978), 11.
48 Erwin Tomash and Arnold A. Cohen, “The Birth of an ERA: Engineering Research Associates,
Inc., 1946–1955,” Annals of the History of Computing, Vol. 1, No. 2 (1979), 83–97.
49 Julian Bigelow, “Computer Development at the Institute for Advanced Study,” in N. Metropolis,
J. Howlett, and Gian-Carlo Rota, eds., A History of Computing in the Twentieth Century (New
York: Academic Press, 1980), 291–310.
50 Accounting for research and development costs is inherently problematic, for example because of
the sometimes fine line between procurement and development expenditures. Defense Department
accounting practices do not always offer a clearcut distinction between R&D and other budgets. See
Flamm, Targeting the Computer, 94.
20 to 25 percent of the total. The vast bulk of federal research funds at that time
came from military agencies.

In the early 1950s the company-funded share of R&D began to rise (to
about $15 million by 1954), but between 1949 and 1959 the major corporations
developing computer equipment -- IBM, General Electric, Bell Telephone, Sperry
Rand, Raytheon, and RCA -- still received an average of 59 percent of their
funding from the government (again, primarily from military sources). At
Sperry Rand and Raytheon, the government share during this period
approached 90 percent.51 The first commercial production computer, Remington
Rand’s UNIVAC I, embodied the knowledge Eckert and Mauchly had gained
from working on the military-funded ENIAC and later on their BINAC, which
had been built as a guidance computer for Northrop Aircraft’s Snark missile.
Though much of the funding for Eckert and Mauchly’s project was channeled
through the Census Bureau (which purchased the first UNIVAC I), the
funds were transferred to Census from the Army.52

Flamm also concludes that even when R&D support came primarily from
company sources, it was often the expectation of military procurements that
provided the incentive to invest. For instance, IBM’s first production computer
(the 701, also known as the “Defense Calculator”), first sold in 1953, was
developed at IBM’s expense, but only with letters of intent in hand from
eighteen DoD customers.53

Consequences of Military Support

What sort of influence did this military support have on the development of
computers? In chapters 3 and 4 we will explore this question in great detail with
respect to the Whirlwind computer, the SAGE air defense system, the Rand
Corporation, and the Vietnam War. Here, however, I will sketch some more
general answers through a series of examples.

51 Flamm, Targeting the Computer, 94–96.


52 Nancy Stern, “The BINAC: A Case Study in the History of Technology,” Annals of the History
of Computing, Vol. 1, No. 1 (1979), 9–20; Stern, From ENIAC to UNIVAC. Shipped from Eckert and
Mauchly’s Philadelphia workshop to California, the BINAC failed to function after being
reassembled. Northrop soon abandoned it in favor of analog computers.
53 Flamm, Targeting the Computer, 64, 76. The outbreak of the Korean War caused IBM chairman
Thomas J. Watson, Sr., to reactivate IBM’s military products division, providing an opportunity for
Thomas J. Watson, Jr., to initiate electronic computer development there. (See James Birkenstock,
interviewed by Erwin Tomash and Roger Stuewer, 8/12/80. Charles Babbage Institute, University
of Minnesota.)
First, military funding and purchases in the 1940s and 1950s enabled
American computer research to proceed at a pace so ferocious as to sweep away
competition from Great Britain, the only nation then in a position to become a
serious rival. At the end of World War II the British possessed the world’s only
functioning, fully electronic digital computer (the Colossus, built for wartime codebreaking), and until the
early 1950s its sophistication in computing at least equaled that of the United
States. The Manchester University Mark I became, in June 1948, the world’s first
operating stored-program electronic digital computer (i.e., the first operating
computer in the full modern sense of the term). The Cambridge University
EDSAC, explicitly modeled on the EDVAC, preceded the latter into operation in
May 1949, “the first stored-program electronic computer with any serious
computational ability.”54 The firm of Ferranti Ltd. built the first successful
commercial computer, also called the Mark I, and eventually sold eight of these
machines, primarily to government agencies active in the British atomic
weapons research program. The first Ferranti Mark I became operational in
February 1951, preceding the Eckert/Mauchly UNIVAC by a few months.

With its financial resources limited by the severe demands of postwar
reconstruction, the British government failed to pursue the field with the
intensity of the United States. British researchers and producers were in general
left to more ordinary commercial and technical resources. By the time large-scale
commercial markets for computers developed in the early 1960s, British designs
lagged behind American models. Unable to keep up, the fledgling British
computer industry declined dramatically: though British firms totally dominated
the British market in the 1950s, by 1965 more than half of computers operating in
Britain were U.S.-made.55

Second, the military secrecy surrounding some of both British and
American research impeded the spread of the new technology. Most academic
researchers felt advances would come faster in an atmosphere of free exchange of
ideas and results. They pressed to re-establish such a climate, and in many cases --
such as that of the IAS computer, whose technical reports and plans were widely
disseminated -- they succeeded. But the wartime habits of secrecy died hard, and
in the course of the Cold War tensions between military and commercial
interests rose. In August 1947 Henry Knutson of the ONR’s Special Devices
Center informed Jay Forrester, director of the MIT Whirlwind computer project,
that “the tendency is to upgrade the classification [of military-funded research
projects] and that all computer contracts are now being reconsidered with the

54 Williams, History of Computing Technology, 334.


55 See Flamm, Targeting the Computer, 159 and passim, and Flamm, Creating the Computer:
Government, Industry, and High Technology (Washington, DC: Brookings Institution, 1988),
136–150.
possible view of making them confidential.”56 Much of the Whirlwind work
was, in fact, classified. (Indeed, in the 1950s MIT spun off the Lincoln Laboratory
from its university operations because of the huge volume of classified research
on air defense, including computers.) In the late 1940s, Forrester sometimes had
trouble recruiting researchers because so many people refused to work on
military projects.57 John Mauchly, to cite another kind of postwar security issue,
was accused of being a communist sympathizer (he was not) and was denied a
clearance.58

Though many of the military-sponsored computer projects were not
classified in the direct sense, informal self-censorship remained a part of postwar
academic research culture. As Paul Forman has argued, “strictly speaking there
was in this [post–World War II] period no such thing as unclassified research
under military sponsorship. ‘Unclassified’ was simply that research in which
some considerable part of the responsibility for deciding whether the results
should be held secret fell upon the researcher himself and his laboratory.”
Forman cites the ONR’s Alan Waterman and Capt. R. D. Conrad, writing in 1947,
to the effect that “the contractor is entirely free to publish the results of his work,
but . . . we expect that scientists who are engaged on projects under Naval
sponsorship are as alert and as conscientious as we are to recognize the
implications of their achievement, and that they are fully competent to guard the
national interest.”59

Third, even after mature commercial computer markets emerged in the
early 1960s, U.S. military agencies continued to invest heavily in advanced
computer research, equipment, and software. In the 1960s the private sector
gradually assumed the bulk of R&D funding. IBM, in particular, adopted a
strategy of heavy investment in research, reinvesting over 50 percent of its
profits in internal R&D after 1959. The mammoth research organization IBM
built gave it the technical edge partly responsible for the company’s dominance
of the world computer market for the next two decades. To compete, other
companies eventually duplicated IBM’s pattern of internal research investment.

Despite the extraordinary vitality of commercial R&D after the early 1960s,
the Pentagon continued to dominate research funding in certain areas. For
example, almost half of the cost of semiconductor R&D between the late 1950s
and the early 1970s was paid by military sources. Defense users were first to put

56 Jay W. Forrester, “Computation Book,” entry for 8/13/47. Magnetic Core Memory Records,
1932–1977, MC 140, Box 4, F 27. (Institute Archives and Special Collections, Massachusetts Institute
of Technology, Cambridge, MA).
57 Redmond and Smith (Project Whirlwind, 75) quote a letter from Warren Weaver to Mina Rees to
the effect that MIT professor Samuel Caldwell “would work only ‘on research concerning electronic
computing that will freely serve all science,’ a view shared by many of his colleagues.”
58 On Mauchly’s security problems, see the Appendix to Augarten, Bit by Bit.
59 Forman, “Behind Quantum Electronics,” 206.
into service integrated circuits (ICs, the next major hardware advance after
transistors); in 1961, only two years after their invention, Texas Instruments
completed the first IC-based computer under Air Force contract. The Air Force
also wanted the small, lightweight ICs for Minuteman missile guidance control.
In 1965, about one-fifth of all American IC sales went to the Air Force for this
purpose. Only in that year did the first commercial computer to incorporate ICs
appear.60 ICs and other miniaturized electronic components allowed the
construction of sophisticated digital guidance computers that were small, light,
and durable enough to fit into missile warheads. This, in turn, made possible
missiles with multiple independently targetable reentry vehicles (MIRVs),
which were responsible for the rapid growth of nuclear destructive potential in
the late 1960s and early 1970s.61 ICs were the ancestors of today’s microprocessors
and very-large-scale integrated circuitry, crucial components of modern cruise
missiles and other “smart” weaponry.

Another instance was the nurturance of artificial intelligence (AI) by the
Advanced Research Projects Agency (ARPA, later called DARPA, the Defense
Advanced Research Projects Agency), which extended from the early 1960s until
the final end of the Cold War. AI, for over two decades almost exclusively a pure
research area of no immediate commercial interest, received as much as 80
percent of its total annual funding from ARPA. ARPA also supported such other
important innovations as timesharing and computer networking. In 1983, with
its Strategic Computing Initiative (SCI), DARPA led a concerted Pentagon effort
to guide certain critical fields of leading-edge computer research, such as artificial
intelligence, semiconductor manufacture, and parallel processing architectures,
in particular directions favorable to military goals. (We will return to ARPA and
its relationship with AI in chapters 8 and 9.)

Thus the pattern of military support has been widespread, long-lasting,
and deep. In part because of connections dating to the ENIAC and before, this
pattern became deeply ingrained in postwar institutions. But military agencies
led cutting-edge research in a number of key areas even after a commercial
industry became well established in the 1960s. As Frank Rose has written, “the
computerization of society . . . has essentially been a side effect of the
computerization of war.”62

Why Build Computers?

60 Flamm, Creating the Computer, 17–18. Also see Harry Atwater, “Electronics and Computer
Development: A Military History,” Technology and Responsibility, Vol. 1, No. 2 (1982).
61 Ted Greenwood, Making the MIRV (Cambridge: Ballinger, 1975).
62 Frank Rose, Into the Heart of the Mind (New York: Harper and Row, 1984), 36.
We have explored the origin of military support, its extent, and some of its
particular purposes. Now we must return once again to the question posed in
the chapter title, this time at the level of more general institutional and technical
problems. Why did the American armed forces establish and maintain such an
intimate involvement with computer research?

The most obvious answer comes from the utilitarian side of the vision
captured in General Westmoreland’s “electronic battlefield” speech: computers
can automate and accelerate important military tasks. The speed and complexity
of high-technology warfare have generated control, communications, and
information analysis demands that seem to defy the capacities of unassisted
human beings. Jay Forrester, an MIT engineer who played a major role in
developing the military uses of computing, wrote that between the mid-1940s
and the mid-1950s

the speed of military operations increased until it became clear
that, regardless of the assumed advantages of human judgment
decisions, the internal communication speed of the human
organization simply was not able to cope with the pace of
modern air warfare. . . . In the early 1950s experimental
demonstrations showed that enough of [the] decision making
[process] was understood so that machines could process raw
data into final weapon-guidance instruction and achieve results
superior to those then being accomplished by the manual
systems.63

Computers thus improved military systems by “getting man out of the
loop” of critical tasks. Built directly into weapons systems, computers assisted or
replaced human skill in aiming and operating advanced weapons, such as
antiaircraft guns and missiles. They automated the calculation of ballistic firing tables. They
solved difficult mathematical problems in weapons engineering and in the
scientific research behind military technologies, augmenting or replacing human
calculation. Computers began to form the keystone of what the armed forces now
call “C3I” -- command, control, communications, and intelligence (or
information) networks, replacing and assisting humans in the encoding and
decoding of messages, the interpretation of radar data, and tracking and targeting
functions, among many others.

63 Jay W. Forrester, “Managerial Decision Making,” in Martin Greenberger, ed., Computers and the World of the Future (Cambridge, MA: MIT Press, 1962), 53.
I will argue that this automation theory is largely a retrospective
reconstruction. In the 1940s it was not at all obvious that electronic digital
computers were going to be good for much besides exotic scientific calculations.
Herman Goldstine recalled that well into the 1950s “most industrialists viewed
[digital] computers mainly as tools for the small numbers of university or
government scientists, and the chief applications were thought to be highly
scientific in nature. It was only later that the commercial implications of the
computer began to be appreciated.”64 Furthermore, the field of analog
computation was well developed, with a strong industrial base and a well-
established theoretical grounding. Finally, analog control mechanisms
(servomechanisms) had seen major improvements during the war. They were
readily available, well-understood, and reliable.

Howard Aiken, the Harvard designer of several early digital computers,
told Edward Cannon that “there will never be enough problems, enough work,
for more than one or two of these [digital] computers,” and many others
agreed.65

Analog vs. Digital: Computers and Control

Most modern computers perform three basic types of functions: calculation,
communication, and control.66 The computers of the 1940s could not yet do this;
they were calculators, pure and simple. Their inputs and outputs consisted

64 Goldstine, The Computer from Pascal to von Neumann, 251.


65 Aiken’s Mark I project was begun in 1937 and completed in 1943. Despite its large scale, neither
this nor his subsequent computers had much influence on the main stream of computer development,
largely due to his conservative technological approach (according to Augarten, Bit by Bit). Cannon
reported Aiken’s comment in his sworn testimony for Honeywell v. Sperry Rand, p. 17935, cited in
Stern, From ENIAC to UNIVAC, 111.
66 Calculation is the mathematical and logical function: crunching numbers, analyzing data,
working with Boolean (true-false) variables. Communication includes the transfer of text, data,
and images among computers through networks, electronic mail, and fax. In control functions
computers operate other machines: telephone switching networks, automobile engines, lathes. (The
control of spacecraft from Earth provides an example of how the three functions are now integrated:
computers calculate trajectories and determine which engines to burn, for how long, to achieve
needed course corrections. They then control the spacecraft via digital communications, translated
by the spacecraft’s own computer into signals controlling the jets.) While pure calculation may be
done at any speed and may be interrupted and resumed without loss, communication is often, and
control is almost always, a real-time function. The person listening to a radio transmission or
telephone call hears it at exactly the same pace and almost exactly the same time as the person
speaking into the microphone; while the communication may be interrupted and resumed, this is
usually inconvenient and undesirable. The servomechanism or mechanical linkage controlling a
machine such as a car operates at exactly the same pace as the machine itself. If such control is
interrupted, the machine stops or, worse, continues to operate “out of control.”
exclusively of numbers or, eventually, of other symbols printed on paper or
punched on cards. In most of the first machines, both decimal numbers and
instructions had to be translated into binary form. Each computer’s internal
structure being virtually unique, none could communicate with others. Neither
(with the exception of printers and card punches) could they control other
machines.

Deep theoretical linkages among the three functions were already being
articulated in the communication and information theories of Norbert Wiener
and Claude Shannon. But these theoretical insights did not dictate any particular
path for computer development. Nor did they mandate digital equipment. The
idea of combining the three functions in a single machine, and of having that
machine be an electronic digital computer, came not just from theory -- both
Shannon and Wiener, for example, were also interested in other types of
machines67 -- but from the evolution of practical design projects in social and
cultural context.

The idea of using digital calculation for control functions involved no
special leap of insight, since the role of any kind of computer in control is
essentially to solve mathematical functions. (Indeed, the RCA engineer Jan
Rajchman attempted to construct a digital fire-control computer for antiaircraft
guns in the early 1940s.)68 But unlike then-extant digital machines, analog
computers integrated very naturally with control functions, since their inputs
and outputs were often signals of the same type as those required to control other
machines (e.g., electric voltages or the rotation of gears).69 The conversion of
data into and out of numerical form thus constituted a difficult extra step that
could often be bypassed. In addition, because many electrical devices, including
vacuum tubes, have analog as well as digital properties, the increasing shift from
electro-mechanical to electronic control techniques had little bearing on the
question of digital vs. analog equipment. In fact, some of the wartime analog
computers, such as Bush’s RDA and the Bell gun directors discussed below, used
electronic components.70 Finally, the wartime investment in digital computing

67 Shannon, for example, presented an analog maze-solving machine to the eighth Macy
Conference. Heinz von Foerster, ed., Transactions of the Eighth Conference on Cybernetics (New
York: Josiah Macy Jr. Foundation, 1952), 173–180. Also see Shannon, “Computers and Automata,”
Proceedings of the IRE, Vol. 41 (1953), 1234–1241. Norbert Wiener, Cybernetics (Cambridge, MA:
MIT Press, 1948), describes a wide variety of devices; after the war Wiener became deeply
interested in prosthetics.
68 See Jan Rajchman, “Early Research on Computers at RCA,” in Metropolis, Howlett, and Rota, A
History of Computing in the Twentieth Century, 465–469.
69 David Mindell has recently argued that Bush’s Differential Analyzer was not primarily a
calculating engine, but a real-time control system. See David Mindell, “From Machinery to
Information: Control Systems Research at MIT in the 1930s,” paper presented at Society for the
History of Technology Annual Meeting, Lowell, MA, 1994, 18.
70 The future role of digital equipment in communication itself was also far from clear, since almost
all electronic communication technologies after the (digital) telegraph employed (analog)
represented by ENIAC shrank into insignificance when compared with the
wartime program in radar and control systems research, which were primarily
analog technologies, with the result that far more engineers understood analog
techniques than grasped the new ideas in digital computing.

Many of the key actors in computer development, such as Bell
Laboratories and MIT, had major and long-standing investments in analog
computer technologies. For example, in 1945, as the ENIAC was being completed,
Bell Labs was commissioned to develop the Nike-Ajax antiaircraft guided missile
system for the Army. Bell proposed a “command-guidance” technique in which
radar signals would be converted into missile guidance instructions by ground-
based analog computers.71 Likewise, one of MIT’s major wartime research
groups was the Servomechanisms Laboratory, which built analog control devices
for antiaircraft gun directors and other uses.72

With a vigorous tradition of analog computation and control engineering
already in place after the war, work proceeded rapidly on general-purpose
electronic analog computers. A number of mammoth machines, such as RCA’s
Typhoon, were constructed under both corporate and military sponsorship. Mina
Rees, then director of the ONR’s Mathematical Sciences Division, noted in a 1950
public report on federal support for computer research that the ONR continued
to fund a variety of analog machines. She pointed to the robust health of the
analog computer and control industry as one reason the ONR’s analog program
was not even larger. “There is,” she pointed out, “vastly more analog than digital
equipment that has been built without government support, but . . . the
government and its contractors make extensive use of the equipment.” Rees also
praised the “broad point of view that recognizes merit in both the analog and the
digital aspects of the computer art.”73 Analog engineers thought their computers
could compete directly with digital devices in any arena that did not demand
enormous precision.

These machines and the social groups centered around them (such as
industrial research laboratories, university engineering schools, and equipment
manufacturers) constituted a major source of resistance to the emerging digital

waveforms, converting sound waves into radio or electrical waves and back again. However, digital
switches -- relays -- were the primary control elements of the telephone network, and a source of
some of the early work on digital computers. As we will see in chapter 8, Claude Shannon’s wartime
work on encryption of voice transmissions produced the first approaches to digitizing sound waves.
71 C. A. Warren, B. McMillan, and B. D. Holbrook, “Military Systems Engineering and Research,”
in Fagen, A History of Engineering and Science in the Bell System, 618.
72 40,000 of the analog 40mm gun directors designed by the Servomechanisms Lab were
manufactured during the war. Its budget, by the war’s end, was over $1 million a year. Wildes and
Lindgren, A Century of Electrical Engineering and Computer Science at MIT, 211.
73 Mina Rees, “The Federal Computing Machine Program,” Science, Vol. 112 (1950), 732. Reprinted
in Annals of the History of Computing, Vol. 7, No. 2 (1985), 156–163.
paradigm, especially when it came to using the new machines for purposes other
than mathematical calculation. In the words of one participant,

Analog computer experts felt threatened by digital computers.
World War II, with its emphasis on automatic pilots and
remotely controlled cannon, fostered the analog computer-servo
engineering profession. . . . Many analog computer engineers
were around following the war, but so great was the newly
realized demand for control devices that the colleges began
training increasing numbers. . . . [O]nly a relatively few servo
engineers were able to make the transition to digital machines. . .
. In 1945 . . . we confidently expected that factories would have
become softly humming hives of selsyn motors, amplidyne
generators and analog computers by the year 1960.74

Even as late as 1950, among the groups then developing digital machines,
the heritage of World War II analog equipment proved difficult to overcome.
When a Rand team seeking a programmable digital machine toured the
country’s major computer projects, “what [they] found was discouraging.” Many
of the groups working on reliability and high-speed computing were exploring
“modifications of radar technology, which was largely analog in nature. . . . They
were doing all kinds of tweaky things to circuits to make things work. It was all
too whimsical.”75

In addition to social inertia, easy availability, and an acculturated
preference for analog technology, there were many other reasons why
sophisticated engineers might reject electronic digital computers for most
purposes during the 1940s and early 1950s. First, the electronic components of the
day were not very reliable. As we have seen, most scientists scoffed at the idea
that a machine containing vast numbers of vacuum tubes could ever function
for more than a few minutes at a time without breaking down. Thus to
contemplate using electronic digital machines for control functions, in real time
and in situations where safety and/or reliability were issues, seemed
preposterous to many. Second, early electronic computers were huge assemblies,
the size of a small gymnasium, that consumed power voraciously and generated
tremendous heat. They often required their own power supplies, enormous air
conditioners, and even special buildings. Miniaturization on the scale we take
for granted today had not emerged even as a possibility. Third, they were

74 George E. Valley Jr., “How the SAGE Development Began,” Annals of the History of Computing,
Vol. 7, No. 3 (1985), 218.
75 Fred J. Gruenberger, “The History of the JOHNNIAC,” Annals of the History of Computing, Vol.
1, No. 1 (1979), 50.
extremely expensive (by the standards of analog equipment), and they demanded
constant and costly maintenance. Finally, early electronic computers employed
exotic materials and techniques, such as mercury delay line memory and the
cantankerous electrostatic storage tube, which added their own problems to the
issues of cost and reliability.

Even once it became clear (in the late 1940s) that electronic digital
computers would work, could be made reasonably reliable, and could operate at
speeds far outstripping their mechanical and electro-mechanical counterparts,
another issue prevented them from being seriously considered for control
functions. As George Valley, one of the leaders of the SAGE project, pointed out
in a 1985 retrospective, “relatively few wanted to connect computers to the real
world, and these people seemed to believe that the sensory devices would all
yield data. In fact, only some sensors -- such as weighing machines, odometers,
altimeters, the angle-tracking part of automatic tracking radars -- had built-in
counters. Most sensory devices relied on human operators to interpret noisy and
complex signals.”76 The problem lay in designing sensory devices that produced
direct numerical inputs for the computer to calculate with. Analog control
technologies did not require such conversions, because they represented
numerical quantities directly through physical parameters.77

In 1949, according to Valley, “almost all the groups that were realistically
engaged in guiding missiles . . . thought exclusively in terms of analog
computers.”78 A notable exception was the Northrop Snark missile project,
which engaged Eckert and Mauchly to build the BINAC digital computer,
completed in 1949, for its guidance system. However, the BINAC did not work
well, and Northrop engineers afterward moved toward special-purpose digital
differential analyzers -- and away from stored-program general-purpose
computers -- for the project.79

As late as 1960 Albert Jackson, manager of data processing for the TRW
Corporation, could write with authority, in a textbook on analog computation,

76 Valley, “How the SAGE Development Began,” 207. Italics added.


77 A simple example can illustrate this point. An ordinary thermostat is an analog control device
with a digital (on/off, discrete-state) output. A coiled bimetal strip inside the thermostat varies
in length continuously with the ambient temperature. To it is affixed a horizontal tube containing a
drop of mercury. As the temperature drops, the coil shrinks and the tube tilts past the horizontal.
This causes the mercury to flow to one end of the tube, closing an electrical connection that activates
the heating system. As the temperature then rises, the coil slowly expands, the tube eventually
tilts the other way, and the mercury flows to its other end. This deactivates the switch, and the
heater turns off. This device might thus be said to compute a function: if temperature t ≥
temperature setting s, heater setting h = 0 (off); if t < s, h = 1 (on). This function could be computed
using numerical inputs for t and s. But since the function is represented directly in the device, no
conversion of physical quantities into numerical values is required.
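Restated in digital terms -- the terms an engineer of the period would have had to adopt in order to use an electronic digital computer here -- the rule reduces to a short program, but only once the physical temperature has been converted into a number. A minimal Python sketch (illustrative names and values only, ignoring the hysteresis a real thermostat exhibits) might read:

# Digital restatement of the thermostat's rule described above. Unlike the
# analog device, which embodies the rule directly in metal and mercury, a
# digital controller needs the temperature supplied as a number.
def heater_setting(t: float, s: float) -> int:
    """Return 1 (heater on) if measured temperature t is below setting s, else 0 (off)."""
    return 1 if t < s else 0

# Example: with the thermostat set to 20 degrees, a reading of 18 turns the
# heater on; a reading of 21 turns it off.
assert heater_setting(18.0, 20.0) == 1
assert heater_setting(21.0, 20.0) == 0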
78 Valley, “How the SAGE Development Began,” 206.
79 See Stern, From ENIAC to UNIVAC.
that electronic analog computers retained major advantages. They would always
be better at control functions and most simulations, as well as faster than digital
devices.

The [general-purpose] electronic analog computer is a very fast
machine. Each operational unit can be likened to a digital
arithmetical unit, memory unit, and control unit combined.
Since as many as 100 to 500 of these units will be employed in
parallel for a particular problem setup, it can be seen why an
analog computer is faster than a digital machine, which seldom
has more than one arithmetical unit and must perform
calculations bit by bit or serially. Because of their high speed,
electronic analog computers have found wide application as
real-time simulators and control-system components.

Only in the 1980s did efficient digital parallel processing become possible,
motivated in part by precisely this issue of real-time control. Jackson continued:

In conclusion, analog computers have found and will continue
to find wide application to problems where the knowledge of the
physical situation does not permit formulation of a numerical
model of more than four significant digits or where, even if such
a model could be designed, the additional time and expense
entailed in digital computation would not be warranted because
of other factors.80

Clearly, in the decade following World War II digital computers were a
technology at the early phase of development that Trevor Pinch and Wiebe
Bijker describe as, in essence, a solution in search of a problem. The technology
of digital computation had not yet achieved what they call “closure,” or that state
of technical development and social acceptance in which large constituencies
generally agree on its purpose, meaning, and physical form.81 The shape of
computers, as tools, was still extremely malleable, and their capacities remained
to be envisioned, proven, and established in practice. Thus the use of digital
devices to create automated, centralized military command-control systems was
anything but foreordained.

80 Albert S. Jackson, Analog Computation (New York: McGraw-Hill, 1960), 8.


81 Trevor Pinch and Wiebe Bijker, “The Social Construction of Facts and Artifacts,” in Bijker,
Hughes, and Pinch, The Social Construction of Technology, 17–50.
Computers Take Command

The utilitarian account of military involvement in computer development also
fails to explain one of the major paradoxes of military automation. Computers
were used first to automate calculation, then to control weapons and guide
aircraft, and later to analyze problems of command through simulation. The
final step in this logic would be the eventual automation of command itself;
intermediate steps would centralize it and remove responsibilities from lower
levels. Military visionaries and defense intellectuals continually held out such
centralization as some kind of ultimate goal, as in General Westmoreland’s
dream of the electronic battlefield. By the mid-1980s, DARPA projects envisioned
expert systems programs to analyze battles, plot strategies, and execute responses
for carrier battle group commanders. The Strategic Computing Initiative
program announcement claimed that in “the projected defense against strategic
nuclear missiles . . . systems must react so rapidly that it is likely that almost
complete reliance will have to be placed on automated systems” and proposed to
develop their building blocks.82 DARPA’s then-director Robert Cooper asserted,
in an exchange with Senator Joseph Biden, that with sufficiently powerful
computers, presidential errors in judgment during a nuclear confrontation
might be rendered impossible: “we might have the technology so he couldn’t
make a mistake.”83

The automation of command clearly runs counter to ancient military
traditions of personal leadership, decentralized battlefield command, and
experience-based authority.84 By the early 1960s, the beginning of the McNamara
era and the early period of the “electronic battlefield,” many military leaders had
become extremely suspicious of the very computers whose development their
organizations had led. Those strategists who felt the necessity and promise of
automation described by Jay Forrester were opposed by others who saw that the
domination of strategy by preprogrammed plans left no room for the
extraordinarily contingent nature of battlefield situations. In 1964, Air Force
Colonel Francis X. Kane reported in the pages of Fortune magazine that “much
of the current planning for the present and future security of the U.S. rests on

82 Defense Advanced Research Projects Agency, Strategic Computing (Washington, DC: Defense
Advanced Research Projects Agency, 1983), 8.
83 Jonathan Jacky, “The Strategic Computing Program,” in David Bellin and Gary Chapman, eds.,
Computers in Battle (New York: Harcourt Brace, 1987), 184.
84 For entrées to the large literature on this topic, see Gary Chapman, “The New Generation of
High-Technology Weapons,” in Bellin and Chapman, Computers in Battle, 61–100; Chris Hables
Gray, Computers as Weapons and Metaphors: The U.S. Military 1940–1990 and Postmodern War
(unpublished Ph.D. thesis, University of California, Santa Cruz, 1991); and Morris Janowitz, The Professional Soldier (Glencoe, IL: Free Press, 1960).
computerized solutions.” It was, he wrote, impossible to tell whether the actual
results of such simulated solutions would occur as desired, because

we have no experience in comparing the currently accepted
theory of predicting wars by computer with the actual practice of
executing plans. But I believe that today’s planning is inadequate
because of its almost complete dependence on scientific
methodology, which cannot reckon with those acts of will that
have always determined the conduct of wars. . . . In today’s
planning the use of a tool -- the computer -- dictates that we
depend on masses of data of repeated events as one of our
fundamental techniques. We are ignoring individual experience
and depending on mass experience instead.85

Also in the early 1960s occasional articles in the armed forces journal
Military Review began warning of “electronic despotism” and “demilitarized
soldiers” whose tasks would be automated to the point that the men would be
deskilled and become soft.86 Based on interviews with obviously disaffected
commanders, U.S. News & World Report reported in 1962 -- under the banner
headline “Will ‘Computers’ Run the Wars of the Future?” -- that “military men
no longer call the tunes, make strategy decisions and choose weapons. In the
Pentagon, military men say they are being forced to the sidelines by top civilians,
their advice either ignored or not given proper hearing. . . . In actual defense
operations, military commanders regard themselves as increasingly dependent
on computer systems.”87 While these reports certainly exaggerated the actual
role of computers in military planning and especially in military operations at
the time, their existence shows that the view of computers as a solution to
military problems faced internal opposition from the start. They also
demonstrate how deeply an ideology of computerized command and control had
penetrated into U.S. military culture.

The automation theory alone, then, explains neither the urgency, the
magnitude, nor the specific direction of the U.S. military effort in computing.
Rather than explain how contests over the nature and potential of computers
were resolved, a utilitarian view writes history backwards, using the results of
those contests to account for their origins.

85 Colonel Francis X. Kane, USAF, "Security Is Too Important To Be Left to Computers," Fortune,
Vol. 69, No. 4 (1964), 146–147.
86 Ferdinand Otto Miksche, “The Soldier and Technical Warfare,” Military Review, Vol. 42, No. 8
(1962), 71–78; Major Keith C. Nusbaum, U.S. Army, “Electronic Despotism: A Serious Problem of
Modern Command,” Military Review, Vol. 42, No. 4 (1962), 31–39.
87 “Will ‘Computers’ Run Wars of the Future?,” U.S. News & World Report (April 23, 1962), 44–48.
Nor does a utilitarian view explain the pervasive military fascination
with computers epitomized by General Westmoreland’s speech in the aftermath
of Vietnam. “I see,” he proclaimed, “an Army built into and around an
integrated area control system that exploits the advanced technology of
communications, sensors, fire direction, and the required automatic data
processing -- a system that is sensitive to the dynamics of the ever-changing
battlefield -- a system that materially assists the tactical commander in making
sound and timely decisions.”88 This is the language of vision and technological
utopia, not practical necessity. It represents a dream of victory that is bloodless for
the victor, of battle by remote control, of speed approaching the instantaneous,
and of certainty in decision-making and command. It is a vision of a closed
world, a chaotic and dangerous space rendered orderly and controllable by the
powers of rationality and technology.

Why build computers? In this chapter I have tried to show that not only
the answers, but also the very question, are complex. The importance of computers to the
future of U.S. military power was by no means obvious at the outset. To
understand how it became so, we must look closely at the intricate chains of
technological advances, historical events, government policies, and emergent
metaphors comprising closed-world discourse. For though policy choices at the
largest levels determined research directions, in some cases quite specifically,
defining digital computation as relevant to national priorities was not itself a
policy issue. Instead it involved a complicated nexus of technological choices,
technological traditions, and cultural values. In fact, digital computer research
itself ended up changing national priorities, as we will see in the following
chapters.

88 Westmoreland, “Address,” 222.
