2 Why Build Computers? The Military Role in Computer Research
On the battlefield of the future, enemy forces will be located, tracked, and
targeted almost instantaneously through the use of data links, computer
assisted intelligence evaluation, and automated fire control. With first round
kill probabilities approaching certainty, and with surveillance devices that
can continually track the enemy, the need for large forces to fix the opposition
physically will be less important. . . . [A]n improved communicative system . . .
would permit commanders to be continually aware of the entire battlefield
panorama down to squad and platoon level. . . . Today, machines and
technology are permitting economy of manpower on the battlefield. . . . But the
future offers even more possibilities for economy. I am confident the American
people expect this country to take full advantage of its technology -- to
welcome and applaud the developments that will replace wherever possible
the man with the machine. . . . With cooperative effort, no more than 10 years
should separate us from the automated battlefield.1
For two decades, from the early 1940s until the early 1960s, the armed forces of
the United States were the single most important driver of digital computer
development. Though most of the research work took place at universities and
in commercial firms, military research organizations such as the Office of Naval
Research, the Communications Security Group (known by its code name OP-20-
G), and the Air Comptroller’s Office paid for it. Military users became the
proving ground for initial concepts and prototype machines. As the commercial
computer industry began to take shape, the armed forces and the defense
industry served as the major marketplace. Most historical accounts recognize the
financial importance of this backing in early work on computers. But few, to
date, have grasped the deeper significance of this military involvement.
At the end of World War II, the electronic digital computer technology we
take for granted today was still in its earliest infancy. It was expensive, failure-
prone, and ill-understood. Digital computers were seen as calculators, useful
primarily for accounting and advanced scientific research. An alternative
1 General William Westmoreland, U.S. Army Chief of Staff, “Address to the Association of the
U.S. Army.” Reprinted in Paul Dickson, The Electronic Battlefield (Bloomington, IN: Indiana
University Press, 1976), 215–223.
technology, analog computing, was relatively cheap, reliable (if not terribly
accurate), better developed, and far better supported by both industrial and
academic institutions. For reasons we will explore below, analog computing was
more easily adapted to the control applications that constituted the major uses of
computers in battle. Only in retrospect does it appear obvious that command,
control, and communications should be united within a single technological
frame (to use Wiebe Bijker’s term) centered around electronic digital
computers.2
Why, then, did military agencies provide such lavish funding for digital
computer research and development? What were their near-term goals and
long-term visions, and how were these coupled to the grand strategy and
political culture of the Cold War? How were those goals and visions shaped over
time, as computers moved out of laboratories and into rapidly changing military
systems?
I will argue that military support for computer research was rarely the benign
or disinterested patronage that many historians, taking at face value the public postures of
funding agencies and the reports of project leaders, have assumed it to be. Instead,
practical military objectives guided technological development down particular
channels, increased its speed, and helped shape the structure of the emerging
computer industry. I will also argue, however, that the social relations between
military agencies and civilian researchers were by no means one-sided. More
often than not it was civilians, not military planners, who pushed the
application of computers to military problems. Together, in the context of the
Cold War, they enrolled computers as supports for a far-reaching discourse of
centralized command and control -- as an enabling, infrastructural technology
for the closed-world political vision.
During World War II, virtually all computer research (like most scientific
research and development) was funded directly by the War Department as part of
the war effort. But there are particularly intimate links between early digital
computer research, key military needs, and the political fortunes of science and
engineering after the war. These connections had their beginnings in problems
of ballistics.
One of the Allies’ most pressing problems in World War II was the feeble
accuracy of antiaircraft guns. Airplanes had evolved enormously since World
2 See Wiebe E. Bijker, “The Social Construction of Bakelite: Toward a Theory of Invention,” in
Bijker, Hughes, and Pinch, eds., The Social Construction of Technological Systems, 159–187.
War I, gaining speed and maneuverability. Defense from devastating bombing
raids depended largely on ground-based antiaircraft weapons. But judging how
far ahead of the fast-moving, rapidly turning planes to aim their guns was a task
beyond the skills of most gunners. Vast amounts of ammunition were expended
to bring down a distressingly small number of enemy bombers. The German V-1
“buzz bombs” that attacked London in 1944 made a solution even more urgent.
The problem was solved by fitting the guns with “servomechanisms” -- which
combine a kind of mechanical or electro-mechanical analog computer with a
control mechanism -- able to calculate the plane’s probable future position.3
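To make the fire-control task concrete, the sketch below (a deliberately simplified, hypothetical model, not a description of any wartime director) estimates a lead point by assuming straight-line, constant-velocity flight and a single average shell speed; all names and numbers are invented for illustration.

```python
# Minimal, hypothetical sketch of the lead-prediction problem a WWII gun
# director solved continuously in analog hardware. Assumes straight-line,
# constant-velocity flight; real directors also handled ballistics and turns.

def predict_intercept(target_pos, target_vel, shell_speed, tol=1e-6):
    """Return (aim_point, time_of_flight) for a constant-velocity target.

    target_pos, target_vel: (x, y, z) in meters and meters per second.
    shell_speed: average shell speed in meters per second (a simplification).
    """
    t = 0.0
    for _ in range(100):  # fixed-point iteration: time of flight depends on the aim point
        aim = tuple(p + v * t for p, v in zip(target_pos, target_vel))
        distance = sum(c * c for c in aim) ** 0.5   # gun assumed at the origin
        t_new = distance / shell_speed
        if abs(t_new - t) < tol:
            break
        t = t_new
    return aim, t

# Example with made-up numbers: bomber 5 km out at 120 m/s, shell about 820 m/s.
aim_point, tof = predict_intercept((5000.0, 0.0, 3000.0), (0.0, 120.0, 0.0), 820.0)
print(aim_point, tof)
```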
In World War II, with its constant and rapid advances in gunnery,
Aberdeen’s work became a major bottleneck in fielding new artillery and
3 Mechanical systems are classical Aristotelian machines (e.g., car engines) that perform work
using the physical movement of levers, gears, wheels, etc. Electro-mechanical systems are
machines powered by electric motors or electromagnets (e.g., vacuum cleaners), but part or all of
whose function is still performed through the physical movement of parts. Electronic systems, by
contrast, contain few or no moving parts. They consist entirely of electrical circuitry and perform
their work through transformations of electric current (e.g., televisions or stereo systems).
The distinction between digital and analog methods corresponds closely to the more
intuitive difference between counting and measuring. Digital calculation uses discrete states, such
as the ratchet-like detents of clockwork gears (mechanical), the on-off states of relays (electro-
mechanical switches), or the positive or negative electrical states of transistors (electronic), to
represent discrete numerical values (1, 2, 3, etc.). These values can then be added, subtracted, and
multiplied, essentially by a process of counting. Analog calculation, by contrast, employs
continuously variable states, such as the ratio between the moving parts of a slide rule
(mechanical), the speed of a motor’s rotation (electro-mechanical), or the voltage of a circuit
(electronic), to represent continuously variable numerical quantities (e.g., any value between 0 and
10). These quantities can then be physically combined to represent addition, subtraction, and
multiplication, for example as someone might measure the perimeter of a room by cutting pieces of
string to the length of each wall and then tying them together.
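Purely as an illustration of this counting-versus-measuring distinction (no historical machine is modeled, and all numbers are invented), the toy sketch below adds quantities first by exact discrete counting and then by combining continuous "lengths" subject to a small measurement error, in the spirit of the string analogy above.

```python
import random

# Toy contrast between digital and analog representation (not a model of any
# historical machine). Digital: exact counts of discrete states. Analog:
# continuous physical quantities, read out with limited precision.

def digital_add(a, b):
    """Add two non-negative integers by counting discrete steps, as a ratchet
    or relay register would; the result is exact and repeatable."""
    total = a
    for _ in range(b):
        total += 1
    return total

def analog_add(lengths, measurement_error=0.01):
    """'Tie together' pieces of string: combine continuous lengths, then read
    the result with a small random measurement error (1 percent, invented)."""
    true_total = sum(lengths)
    return true_total * (1 + random.uniform(-measurement_error, measurement_error))

print(digital_add(2, 3))        # always exactly 5
print(analog_add([2.0, 3.0]))   # roughly 5, never exactly repeatable
```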
4 Michael R. Williams, A History of Computing Technology (Englewood Cliffs, NJ: Prentice-Hall,
1985), 78–83.
5 Pesi R. Masani, Norbert Wiener, 1894–1964 (Boston: Birkhäuser, 1990), 68.
antiaircraft systems. Both Wiener and Veblen -- by then distinguished professors
at MIT and Princeton respectively -- once again made contributions. Wiener
worked on the antiaircraft gunnery problem at its most general level. His
wartime studies culminated in the theory of cybernetics (a major precursor of
cognitive psychology). Veblen returned to Aberdeen’s ballistics work as head of
the scientific staff of the Ballistics Research Laboratory (BRL). Just as in World
War I, Veblen’s group employed hundreds of people, this time mostly women,
to compute tables by hand using desk calculators. These women, too, were called
“computers.” Only later, and gradually, was the name transferred to the
machines.6
But alongside them Aberdeen also employed the largest analog calculator
of the 1930s: the differential analyzer, invented by MIT electrical engineer
Vannevar Bush.
Bush invented the differential analyzer at MIT in 1930 to assist in the solution of
equations associated with large electric power networks. The machine used a
system of rotating disks, rods, and gears powered by electric motors to solve
complex differential equations (hence its name). The BRL immediately sought to
copy the device, with improvements, completing its own machine in 1935 at
Aberdeen. At the same time, another copy was constructed at the University of
Pennsylvania’s Moore School of Electrical Engineering in Philadelphia, this one to be used
for general-purpose engineering calculation. The Moore School’s 1930s
collaboration with the BRL, each building a differential analyzer under Bush’s
supervision, was to prove extremely important. During World War II, the two
institutions would collaborate again to build the ENIAC, America’s first full-scale
electronic digital computer.
Bush was perhaps the single most important figure in American science
during World War II, not because of his considerable scientific contributions but
because of his administrative leadership. As war approached, Bush and some of
his distinguished colleagues had used their influence to start organizing the
scientific community for the coming effort. After convincing President
6 The transfer of the name “computer” to the machine was by no means immediate. The Reader’s
Guide to Periodical Literature does not list “computer” as a heading until 1957. Most news articles
of the 1945–1955 period place the word, if they use it at all, in scare quotes. See Paul E. Ceruzzi,
“When Computers Were Human,” Annals of the History of Computing, Vol. 13, No. 3 (1991),
237–244; Henry S. Tropp et al., “A Perspective on SAGE: Discussion,” Annals of the History of
Computing, Vol. 5, No. 4 (1983), 375–398; Jean J. Bartik and Frances E. Holberton, interviewed by
Richard R. Mertz, 5/28/70, Smithsonian Computer Oral History, AC NMAH #196 (Archive Center,
National Museum of American History, Washington, D.C.).
Roosevelt that close ties between the government and scientists would be critical
to this war, they established the National Defense Research Committee (NDRC)
in 1940, with Bush serving as chair. When the agency’s mandate to conduct
research but not development on weapons systems proved too restrictive, Bush
created and took direction of an even larger organization, the development-
oriented Office of Scientific Research and Development (OSRD), which
subsumed the NDRC.7 The OSRD coordinated and supervised many of the huge
science and engineering efforts mobilized for World War II. By 1945 its annual
spending exceeded $100 million; the prewar total for military R&D had been
about $23 million.8
Academic and industrial collaboration with the military under the OSRD
was critically important in World War II. Research on radio, radar, the atomic
bomb, submarines, aircraft, and computers all moved swiftly under its
leadership. Bush’s original plans called for a decentralized research system in
which academic and industrial scientists would remain in their home
laboratories and collaborate at a distance. As the research effort expanded,
however, this approach became increasingly unwieldy, and the OSRD moved
toward a system of large central laboratories.
Contracts with universities varied, but under most of them the university
provided laboratory space, management, and some of the scientific personnel for
large, multidisciplinary efforts. The Radio Research Laboratory at Harvard
employed six hundred people, more of them from California institutions than
from Harvard itself. MIT’s Radiation Laboratory, the largest of the university
research programs, ultimately employed about four thousand people from sixty-
nine different academic institutions.9 Academic scientists went to work for
industrial and military research groups, industrial scientists assisted universities,
and the military’s weapons and logistics experts and liaison officers were
frequent visitors to every laboratory. The war effort thus brought about the most
radical disciplinary mixing, administrative centralization, and social
reorganization of science and engineering ever attempted in the United States.
7 On the NDRC and OSRD, see Daniel Kevles, The Physicists (New York: Alfred A. Knopf, 1978);
David Dickson, The New Politics of Science (New York: Pantheon, 1984); and Bruce L. R. Smith,
American Science Policy since World War II (Washington DC: Brookings Institution, 1991).
8 Paul Forman, “Behind Quantum Electronics: National Security as Basis for Physical Research in
the United States, 1940–1960,” Historical Studies in the Physical and Biological Sciences, Vol. 18,
No. 1 (1987), 152.
9 James Phinney Baxter, Scientists Against Time (Boston: Little, Brown, 1946), 21. On the Radiation
Laboratory see also Karl L. Wildes and Nilo A. Lindgren, A Century of Electrical Engineering and
Computer Science at MIT, 1882–1982 (Cambridge, MA: MIT Press, 1985).
interwar years -- but it added the new ingredient of massive government
funding and military direction. MIT, for example, “emerged from the war with a
staff twice as large as it had had before the war, a budget (in current dollars) four
times as large, and a research budget ten times as large -- 85 percent from the
military services and their nuclear weaponeer, the AEC.”10 Eisenhower
famously named this new form the “military-industrial complex,” but the nexus
of institutions is better captured by the concept of the “iron triangle” of self-
perpetuating academic, industrial, and military collaboration.11
Thus it so happened that the figure most central to World War II science
was also the inventor of the prewar period’s most important computer
technology. Bush’s laboratory at MIT had established a tradition of analog
computation and control engineering -- not, at the time, separate disciplines -- at
the nation’s most prestigious engineering school. This tradition, as we will see,
weighed against the postwar push to build digital machines. Simultaneously,
though, the national science policies Bush helped create had the opposite effect.
The virtually unlimited funding and interdisciplinary opportunities they
provided encouraged new ideas and new collaborations, even large and
expensive ones whose success was far from certain. Such a project was the Moore
School’s Electronic Numerical Integrator and Calculator (ENIAC), the first
American electronic digital computer.
Even with the help of Bush’s differential analyzer, compiling ballistics tables for
antiaircraft weapons and artillery involved tedious calculation. Tables had to be
produced for every possible combination of gun, shell, and fuse; similar tables
were needed for the (analog) computing bombsight and for artillery pieces. Even
with mechanical aids, human “computers” made frequent mistakes,
necessitating time-consuming error-checking routines. The BRL eventually
commandeered the Moore School’s differential analyzer as well. Still, with two
of these machines, the laboratory fell further and further behind in its work.
13 Ibid., 229. J.C.R. Licklider speculated that Bush’s dislike of digital machines stemmed from an
argument with Norbert Wiener over whether binary arithmetic was the best base to use. See John
A. N. Lee and Robert Rosin, “The Project MAC Interviews,” IEEE Annals of the History of
Computing, Vol. 14, No. 2 (1992), 22.
14 See, for example, Bush’s prescient article on the “memex,” a kind of hypertext technology, in “As
We May Think,” Atlantic Monthly, Vol. 176 (1945). Reprinted in Zenon Pylyshyn, ed., Perspectives on the
Computer Revolution (Englewood Cliffs, NJ: Prentice-Hall, 1970), 47–59.
“The automation of this process was . . . the raison d’être for the first
electronic digital computer,” wrote Herman Goldstine, co-director of the ENIAC
project. The best analog computers, even those built during the war, were only
“about 50 times faster than a human with a desk machine. None of these [analog
devices were] sufficient for Aberdeen’s needs since a typical firing table required
perhaps 2,000–4,000 trajectories. . . . The differential analyzer required perhaps
750 hours -- 30 days -- to do the trajectory calculations for a table.”15 (To be
precise, however, these speed limitations were due not to the differential
analyzer’s analog characteristics, but to its electro-mechanical nature. Electronic
equipment, performing many functions at the speed of light, could be expected to
provide vast improvements. As Bush’s RDA had demonstrated, electronic
components could be used for analog as well as digital calculation. Thus nothing
in Aberdeen’s situation dictated a digital solution to the computation bottleneck.)
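To suggest the scale of the work behind a single firing-table entry, the following toy sketch integrates one trajectory numerically (a point mass with an invented drag constant and step size); actual ballistic computations used far more elaborate models, and a single table required thousands of such trajectories.

```python
import math

# Toy trajectory integration of the sort a firing-table entry required,
# repeated for every combination of gun, shell, fuse, and elevation.
# The drag model, constants, and step size below are illustrative, not historical.

def trajectory_range(muzzle_velocity, elevation_deg, drag_coeff=0.0001,
                     dt=0.01, g=9.81):
    """Integrate a point-mass trajectory with quadratic drag; return range in meters."""
    vx = muzzle_velocity * math.cos(math.radians(elevation_deg))
    vy = muzzle_velocity * math.sin(math.radians(elevation_deg))
    x = y = 0.0
    while y >= 0.0:                          # step until the shell returns to ground level
        speed = math.hypot(vx, vy)
        ax = -drag_coeff * speed * vx        # drag opposes the direction of motion
        ay = -g - drag_coeff * speed * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x

# One table entry; a full firing table needed thousands of such runs.
print(round(trajectory_range(800.0, 45.0)))
```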
15 Herman Goldstine, The Computer from Pascal to von Neumann (Princeton, NJ: Princeton
University Press, 1972), 135–136. As noted in chapter 1, the British Colossus, not the ENIAC, was
actually the first electronic digital computer. Because of British secrecy, Goldstine may not have
known this when he wrote the cited lines in 1972.
16 Williams, History of Computing Technology; Stan Augarten, Bit by Bit: An Illustrated History
of Computers (New York: Ticknor & Fields, 1984), 119–120.
17 Williams, History of Computing Technology, 275. The estimate of part failure probability is due
to Herman Goldstine.
When completed in 1945, the ENIAC filled a large room at the Moore
School with equipment containing 18,000 vacuum tubes, 1,500 relays, 70,000
resistors, and 10,000 capacitors. The machine consumed 140 kilowatts of power
and required internal forced-air cooling systems to keep from catching fire. The
gloomy forecasts of tube failure turned out to be correct, in one sense: when the
machine was turned on and off on a daily basis, a number of tubes would burn
out almost every day, leaving it nonfunctional about 50 percent of the time, as
predicted. Most failures, however, occurred during the warm-up and cool-down
periods. By the simple (if expensive) expedient of never turning the machine off,
the engineers dropped the ENIAC’s tube failures to the more acceptable rate of
one tube every two days.18
The great mathematician John von Neumann became involved with the
ENIAC project in 1944, after a chance encounter with Herman Goldstine on a
train platform. By the end of the war, with Eckert, Mauchly, and others, von
Neumann had planned an improved computer, the EDVAC. The EDVAC was
the first machine to incorporate an internal stored program, making it the first
true computer in the modern sense.19 (The ENIAC was programmed externally,
using switches and plugboards.) The plan for the EDVAC’s logical design served
as a model for nearly all future computer control structures -- often called “von
Neumann architectures” -- until the 1980s.20
Initially budgeted at $150,000, the ENIAC finally cost nearly half a
million dollars. Without the vast research funding and the atmosphere of
desperation associated with the war, it probably would have been years, perhaps
decades, before private industry attempted such a project. The ENIAC became,
like radar and the bomb, an icon of the miracle of government-supported “big
science.”
18 Ibid., 285.
19 In contemporary parlance the word “computer” refers to electronic digital machines with a
memory and one or more central processing units. In addition, to qualify as a computer a device must
be capable of (a) executing conditional branching (i.e., carrying out different sets of instructions
depending upon the results of its own prior calculations) and (b) storing these instructions
(programs) internally. Babbage’s Analytical Engine had most of these features but would have
stored its programs externally on punched cards. Somewhat like the Analytical Engine, the
EDVAC was only partially completed; thus it represents the first true computer design but not the
first actually operating computer. Instead, the British Manchester University Mark I achieved
that honor in 1948. See Williams, History of Computing Technology, 325; Augarten, Bit by Bit, 149.
20 Goldstine distributed this plan, under von Neumann’s name but unbeknownst to him, as the
famous and widely read “Draft Report on the EDVAC.” Because von Neumann’s name was on its
cover, the misunderstanding arose that he was the report’s sole author. But many of the Draft
Report’s key ideas actually originated with Eckert, Mauchly, and other members of the ENIAC
design team. Because of this misunderstanding, which later escalated into a lawsuit, and the fame
he acquired for other reasons, von Neumann has received more credit for originating computer
design than he probably deserved. See Augarten, Bit by Bit, 136ff. The most essential feature of the
so-called von Neumann architecture is serial (one-by-one) processing of the instruction stream.
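The features singled out in notes 19 and 20 (a program stored in the same memory as its data, conditional branching on prior results, and serial, one-instruction-at-a-time execution) can be illustrated with a minimal interpreter; the instruction set below is invented for the example and does not describe the EDVAC itself.

```python
# A deliberately minimal sketch (invented instruction set, not the EDVAC's) of
# the features named in notes 19 and 20: program and data share one memory,
# instructions execute serially one at a time, and a conditional branch lets
# the results of prior calculations alter the flow of control.

def run(memory):
    acc, pc = 0, 0                          # accumulator, program counter
    while True:
        op, arg = memory[pc]                # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMPNEG":               # conditional branch on the accumulator
            if acc < 0:
                pc = arg
        elif op == "JUMP":
            pc = arg
        elif op == "HALT":
            return memory

# Compute 5 * 3 by repeated addition. Cells 0-9 hold instructions, 10-13 data.
memory = [
    ("LOAD", 12), ("JUMPNEG", 3), ("HALT", None),   # 0-2: stop when the counter reaches zero
    ("LOAD", 10), ("ADD", 11), ("STORE", 10),       # 3-5: sum += 5
    ("LOAD", 12), ("ADD", 13), ("STORE", 12),       # 6-8: counter += 1
    ("JUMP", 0),                                    # 9: back to the loop test
    0, 5, -3, 1,                                    # 10-13: sum, addend, counter, one
]
print(run(memory)[10])   # prints 15
```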
The ENIAC was not completed until the fall of 1945, after the war had
ended. The ballistics tables ENIAC was built to compute no longer required
urgent attention. But the ENIAC was a military machine, and so it was
immediately turned to the military ends of the rapidly emerging Cold War. The
first problem programmed on the machine was a mathematical model of a
hydrogen bomb from the Los Alamos atomic weapons laboratories. The ENIAC,
unable to store programs or retain more than twenty ten-digit numbers in its
tiny memory, required several weeks in November 1945 to run the program in a
series of stages. The program involved thousands of steps, each individually
entered into the machine via its plugboards and switches, while the data for the
problem occupied one million punch cards. The program’s results exposed
several problems in the proposed H-bomb design. The director of Los Alamos
expressed his thanks to the Moore School in March 1946, writing that “the
complexity of these problems is so great that it would have been impossible to
arrive at any solution without the aid of ENIAC.”21
This event was symbolic of a major and portentous change. The wartime
alliance of academic and industrial science with the military had begun as a
temporary association for a limited purpose: winning a war against aggressors.
Now it was crystallizing into a permanent union.
21 Augarten, Bit by Bit, 130–131, citing Nancy Stern, From ENIAC to UNIVAC (Boston: Digital
Press, 1981). Whether the ENIAC was actually able to solve this first problem is a matter of
debate among historians. Kenneth Flamm, citing an interview with Stanley Frankel (Smithsonian
Institution Computer Oral History Project, October 5, 1972, conducted by Robina Mapstone), holds
that it failed and that the calculations were actually carried out later at Eckert and Mauchly’s
UNIVAC factory in Philadelphia. See Kenneth Flamm, Targeting the Computer: Government
Support and International Competition (Washington, DC: Brookings Institution, 1987), 79.
22 Barnes, cited in Goldstine, The Computer from Pascal to von Neumann, 229.
the armed forces, the enormous scientific and engineering network assembled
for the war effort would be demobilized and thrown on its own resources.
In [fiscal year] 1938 the total U.S. budget for military research and
development was $23 million and represented only 30 percent of
all Federal R&D; in fiscal 1945 the OSRD alone spent more than
$100 million, the Army and Navy together more than $700
million, and the Manhattan Project more than $800 million. . . .
In the immediate postwar years total military expenditure
slumped to a mere seven times its prewar constant-dollar level,
while constant-dollar military R&D expenditure held at a full 30
times its prewar level, and comprised about 90 percent of all
federal R&D. In the early 1950s total military expenditure soared
again, reaching 20 times its prewar constant-dollar level, while
military R&D reattained, and before the end of the decade much
surpassed, its World War II high.23
Industrial R&D expenditures soared as well. By the late 1940s the total
amount of industrial R&D roughly equaled that sponsored by the federal
government, but as figure 2.1 shows, a decade later government R&D spending
was nearly double that of industry.
This trend, and the politics it reflected, resulted from three concurrent
developments in postwar American politics. First, in the rapid transition from
World War II to the Cold War, the war’s key events served as anchoring icons
for postwar policies. Wartime institutions became blueprints for their postwar
counterparts. Second, the emerging politico-military paradox of a peacetime Cold
War generated a perceived need for new technology, justifying vast military
investments in research. Finally, fierce public debates about postwar federal
support for science and technology had ended in stalemate. Plans for a National
Science Foundation suffered long delays, and military agencies were left to fill
the resulting vacuum. Let us explore each of these developments in turn.
The only combatant nation to emerge from the war intact, the United
States had simultaneously left behind the economic depression and the political
isolationism of the 1930s. The magnitude of this change cannot be
overemphasized, since we have become used to a very different world.25 The
United States was a world industrial power before the Great Depression, but with
a few brief exceptions had played only a minor role in world political and
military affairs during a period when the European colonial empires still ruled
the globe. As late as 1939, the U.S. army numbered only 185,000 men, with an
annual budget under $500 million. America maintained no military alliances
the U.S. was producing 45 percent of the world’s arms and nearly
50 percent of the world’s goods. Two-thirds of all the ships afloat
were American built. . . . The conclusion of the war . . . found the
U.S. either occupying, controlling, or exerting strong influence
in four of the five major industrial areas of the world -- Western
Europe, Great Britain, Japan, and the U.S. itself. Only the Soviet
Union operated outside the American orbit. . . . The U.S. was the
only nation in the world with capital resources available to solve
the problems of postwar reconstruction.27
• The atomic bomb itself, credited with the rapid end to the war
in the Pacific, became the enigmatic symbol of both invincible
power and global holocaust.
The unfolding political crises of the Cold War were invariably interpreted
in these terms.28 For example, the Berlin blockade was perceived as another
potential Munich, calling for a hard-line response rather than a negotiation.
Truman interpreted Korea through the lens of World War II: “Communism was
acting in Korea just as Hitler, Mussolini and the Japanese had acted. . . . I felt
certain that if South Korea was allowed to fall, Communist leaders would be
emboldened to override nations closer to our own shores.”29 Critics of the 1950s
characterized continental air defense as a Maginot Line strategy for starry-eyed
technological optimists. U.S. forward basing of nuclear weapons, in positions
vulnerable to surprise air attack, was likened to the risk of another Pearl
Harbor.30 The Manhattan Project was invoked endlessly to rally support for
major R&D projects such as the space program. Finally, the growing nuclear
arsenal was a reminder of Hiroshima, both horror and symbol of ultimate
power, and it was simply assumed (for a while) that no nation would be willing
to stand up to a weapon of such destructive force.31
28 Rystad has named this interpretation the “Munich paradigm.” See Göran Rystad, Prisoners of
the Past? (Lund, Sweden: CWK Gleerup, 1982), 33 and passim.
29 Harry S Truman, Memoirs: Years of Trial and Hope, Vol. 2 (Garden City, NY: Doubleday, 1956),
332–333.
30 Herbert York, Los Alamos physicist and Eisenhower advisor, noted that “because of Pearl
Harbor you didn’t have to discuss the notion of surprise attack. It was in your bones that the
Russians are perfidious, that surprise attack is the way wars always start, and that appeasement
doesn’t work.” Cited in Gregg Herken, Counsels of War (New York: Knopf, 1983), 125.
31 Atomic bombs, of course, were only one in a long line of weapons Americans (and others) believed
would make war too terrible to fight. See Franklin, War Stars.
Thus in many respects the Cold War was not a new conflict with
communism but the continuation of World War II, the transference of that
mythic, apocalyptic struggle onto a different enemy.32
The Cold War marked the first time in its history that America
maintained a large standing army in peacetime. But its geographical situation of
enormous distance from its enemies, combined with its antimilitarist ethic,
ensured that the institutional form taken by a more vigorous American military
presence would differ from the more traditional European and Soviet
approaches of large numbers of men under arms. Instead of universal
conscription, the United States chose the technological path of massive, ongoing
automation and integration of humans with machines. First Truman and then
Eisenhower, each balancing the contradictory goals of an expanding, activist
global role and a contracting military budget, relied ever more heavily on
nuclear weapons. By the end of the 1950s high technology -- smaller bombs with
higher yields, tactical atomic warheads for battlefield use, bombers of increasingly
long range, high-altitude spy planes, nuclear early warning systems, and rockets
to launch spy satellites and ICBMs -- had become the very core of American
global power.
In his famous 1945 tract Science: The Endless Frontier, composed at President
Roosevelt’s request as a blueprint for postwar science and technology policy,
Vannevar Bush called for a civilian-controlled National Research Foundation to
preserve the government-industry-university relationship created during the
war. In his plea for continuing government support, Bush cited the Secretaries of
War and Navy to the effect that scientific progress had become not merely
helpful but utterly essential to military security for the United States in the
modern world:
36 Secretaries
of War and Navy, joint letter to the National Academy of Sciences, cited in
Vannevar Bush, Science: The Endless Frontier (Washington: U.S. Government Printing Office,
1945), 12.
scientific and engineering schools and industrial organizations with the military
forces “so as to form a continuing, working partnership.”37
Bush also argued that modern medicine and industry were
increasingly dependent on vigorous research efforts in basic science. The massive
funding requirements of such research could not be met by the cash-poor
academic community, while the industrial sector’s narrow and short-term goals
would discourage it from making the necessary investment. Consequently the
new foundation Bush proposed would have three divisions, one for natural
sciences, one for medical research, and one for national defense.
With major research programs created during the war in jeopardy, the
Navy moved into the breach, creating the Office of Naval Research (ONR) in
1946. In a pattern repeated again and again during the Cold War, national
security provided the consensual justification for federally funded research. The
ONR, conceived as a temporary stopgap until the government created the NSF,
became the major federal force in science in the immediate postwar years and
remained important throughout the 1950s. Its mandate was extremely broad: to
fund basic research (“free rather than directed research”), primarily of an
unclassified nature.39
Yet the ONR’s funding was rarely, if ever, a purely altruistic activity.40 The
bill creating the office mentions the “paramount importance [of scientific
research] as related to the maintenance of future naval power, and the
preservation of national security”; the ONR’s Planning Division sought to
maintain “listening posts” and contacts with cutting-edge scientific laboratories
37 Bowles, cited in Wildes and Lindgren, A Century of Electrical Engineering and Computer Science
at MIT, 203.
38 Harry S Truman, cited in James L. Penick Jr. et al., The Politics of American Science: 1939 to the
Present, revised ed. (Cambridge, MA: MIT Press, 1972), 20.
39 The Vinson Bill creating the ONR, cited in ibid., 22.
40 For a somewhat different view of the ONR, see Harvey Sapolsky, Science and the Navy
(Princeton, NJ: Princeton University Press, 1990). Among the best, and certainly to date the most
detailed, discussions of the mixed motives of military support for postwar academic research is
Stuart W. Leslie, The Cold War and American Science: The Military-Industrial-Academic
Complex at MIT and Stanford (New York: Columbia University Press, 1993).
for the Navy’s possible use.41 Lawmakers were well aware that the ONR
represented a giant step down the road to a permanent federal presence in
science and engineering research and a precedent for military influence. House
Committee on Naval Affairs Chairman Carl Vinson opposed the continuing
executive “use of war powers in peacetime,” forcing the Navy to go to Congress
for authorization.42
By 1948 the ONR was funding 40 percent of all basic research in the United
States; by 1950 the agency had let more than 1,200 separate research contracts
involving some 200 universities. About half of all doctoral students in the
physical sciences received ONR support.43 ONR money proved especially
significant for the burgeoning field of computer design. It funded a number of
major digital computer projects, such as MIT’s Whirlwind, Raytheon’s
Hurricane, and Harvard’s Mark III.44 The NSF, finally chartered in 1950 after
protracted negotiations, did not become a significant funding source for
computer science until the 1960s (in part because computer science did not
become an organized academic discipline until then). Even after 1967, the only
period for which reliable statistics are available, the NSF’s share of total federal
funding for computer science hovered consistently around the 20 percent mark,
while DoD obligations ranged between 50 and 70 percent, or 60 to 80 percent if
military-related agencies such as the Department of Energy (responsible for
atomic weapons research) and NASA (whose rockets lifted military surveillance
satellites and whose research contributed to ballistic missile development) are
included.45
41 The Vinson Bill and the ONR Planning Division, cited in Penick et al., The Politics of American
Science, 22–23.
42 Vinson, quoted in Kent C. Redmond and Thomas M. Smith, Project Whirlwind (Boston: Digital
Press, 1980), 105.
43 Dickson, New Politics of Science, 118–119.
44 See Mina Rees, "The Computing Program of the Office of Naval Research, 1946–1953," Annals of
the History of Computing, Vol. 4, No. 2 (1982), 103–113.
45 The figure of 20 percent for NSF support is generous, since the budget category used includes both
mathematics and computer science research. On the DOE and NASA as auxiliary military
agencies, see Flamm, Targeting the Computer, 46 and passim. My discussion in this chapter relies
heavily on Flamm’s published account; on some points I am indebted to him for personal
communications as well. On NASA’s role as a civilian cover for military research and the U.S.
geostrategic aim of establishing international rights of satellite overflight (the “open skies”
policy) in order to obtain intelligence about Soviet military activities, see Walter A. McDougall,
...the Heavens and the Earth: A Political History of the Space Age (New York: Basic Books, 1985).
With the war’s end, some corporate funding became available for computer
research. A few of the wartime computer pioneers, such as ENIAC engineers
Mauchly and Eckert, raised commercial banners. The company they formed
developed the BINAC, the first American stored-program electronic computer,
and then the UNIVAC, the first American commercial computer.46
47 M. D. Fagen, ed., A History of Engineering and Science in the Bell System (Murray Hill, NJ: Bell
Telephone Laboratories, 1978), 11.
48 Erwin Tomash and Arnold A. Cohen, “The Birth of an ERA: Engineering Research Associates,
Inc., 1946–1955,” Annals of the History of Computing, Vol. 1, No. 2 (1979), 83–97.
49 Julian Bigelow, “Computer Development at the Institute for Advanced Study,” in N. Metropolis,
J. Howlett, and Gian-Carlo Rota, eds., A History of Computing in the Twentieth Century (New
York: Academic Press, 1980), 291–310.
50 Accounting for research and development costs is inherently problematic, for example because of
the sometimes fine line between procurement and development expenditures. Defense Department
accounting practices do not always offer a clearcut distinction between R&D and other budgets. See
Flamm, Targeting the Computer, 94.
20 to 25 percent of the total. The vast bulk of federal research funds at that time
came from military agencies.
In the early 1950s the company-funded share of R&D began to rise (to
about $15 million by 1954), but between 1949 and 1959 the major corporations
developing computer equipment -- IBM, General Electric, Bell Telephone, Sperry
Rand, Raytheon, and RCA -- still received an average of 59 percent of their
funding from the government (again, primarily from military sources). At
Sperry Rand and Raytheon, the government share during this period
approached 90 percent.51 The first commercial production computer, Remington
Rand’s UNIVAC I, embodied the knowledge Eckert and Mauchly had gained
from working on the military-funded ENIAC and later on their BINAC, which
had been built as a guidance computer for Northrop Aircraft’s Snark missile.
Though much of the funding for Eckert and Mauchly’s project was channeled
through the Census Bureau (which purchased the first UNIVAC I), the
funds were transferred to Census from the Army.52
Flamm also concludes that even when R&D support came primarily from
company sources, it was often the expectation of military procurements that
provided the incentive to invest. For instance, IBM’s first production computer
(the 701, also known as the “Defense Calculator”), first sold in 1953, was
developed at IBM’s expense, but only with letters of intent in hand from
eighteen DoD customers.53
What sort of influence did this military support have on the development of
computers? In chapters 3 and 4 we will explore this question in great detail with
respect to the Whirlwind computer, the SAGE air defense system, the Rand
Corporation, and the Vietnam War. Here, however, I will sketch some more
general answers through a series of examples.
Despite the extraordinary vitality of commercial R&D after the early 1960s,
the Pentagon continued to dominate research funding in certain areas. For
example, almost half of the cost of semiconductor R&D between the late 1950s
and the early 1970s was paid by military sources. Defense users were first to put
56 Jay W. Forrester, “Computation Book,” entry for 8/13/47. Magnetic Core Memory Records,
1932–1977, MC 140, Box 4, F 27. (Institute Archives and Special Collections, Massachusetts Institute
of Technology, Cambridge, MA).
57 Redmond and Smith (Project Whirlwind, 75) quote a letter from Warren Weaver to Mina Rees to
the effect that MIT professor Samuel Caldwell “would work only ‘on research concerning electronic
computing that will freely serve all science,’ a view shared by many of his colleagues.”
58 On Mauchly’s security problems, see the Appendix to Augarten, Bit by Bit.
59 Forman, “Behind Quantum Electronics,” 206.
into service integrated circuits (ICs, the next major hardware advance after
transistors); in 1961, only two years after their invention, Texas Instruments
completed the first IC-based computer under Air Force contract. The Air Force
also wanted the small, lightweight ICs for Minuteman missile guidance control.
In 1965, about one-fifth of all American IC sales went to the Air Force for this
purpose. Only in that year did the first commercial computer to incorporate ICs
appear.60 ICs and other miniaturized electronic components allowed the
construction of sophisticated digital guidance computers that were small, light,
and durable enough to fit into missile warheads. This, in turn, made possible
missiles with multiple independently targetable reentry vehicles (MIRVs),
which were responsible for the rapid growth of nuclear destructive potential in
the late 1960s and early 1970s.61 ICs were the ancestors of today’s microprocessors
and very-large-scale integrated circuitry, crucial components of modern cruise
missiles and other “smart” weaponry.
60 Flamm, Creating the Computer, 17–18. Also see Harry Atwater, “Electronics and Computer
Development: A Military History,” Technology and Responsibility, Vol. 1, No. 2 (1982).
61 Ted Greenwood, Making the MIRV (Cambridge: Ballinger, 1975).
62 Frank Rose, Into the Heart of the Mind (New York: Harper and Row, 1984), 36.
We have explored the origin of military support, its extent, and some of its
particular purposes. Now we must return once again to the question posed in
the chapter title, this time at the level of more general institutional and technical
problems. Why did the American armed forces establish and maintain such an
intimate involvement with computer research?
The most obvious answer comes from the utilitarian side of the vision
captured in General Westmoreland’s “electronic battlefield” speech: computers
can automate and accelerate important military tasks. The speed and complexity
of high-technology warfare have generated control, communications, and
information analysis demands that seem to defy the capacities of unassisted
human beings. Jay Forrester, an MIT engineer who played a major role in
developing the military uses of computing, wrote that between the mid-1940s
and the mid-1950s
63 Jay
W. Forrester, “Managerial Decision Making,” in Martin Greenberger, ed., Computers and the
World of the Future (Cambridge, MA: MIT Press, 1962), 53.
I will argue that this automation theory is largely a retrospective
reconstruction. In the 1940s it was not at all obvious that electronic digital
computers were going to be good for much besides exotic scientific calculations.
Herman Goldstine recalled that well into the 1950s “most industrialists viewed
[digital] computers mainly as tools for the small numbers of university or
government scientists, and the chief applications were thought to be highly
scientific in nature. It was only later that the commercial implications of the
computer began to be appreciated.”64 Furthermore, the field of analog
computation was well developed, with a strong industrial base and a well-
established theoretical grounding. Finally, analog control mechanisms
(servomechanisms) had seen major improvements during the war. They were
readily available, well-understood, and reliable.
Deep theoretical linkages among the three functions were already being
articulated in the communication and information theories of Norbert Wiener
and Claude Shannon. But these theoretical insights did not dictate any particular
path for computer development. Nor did they mandate digital equipment. The
idea of combining the three functions in a single machine, and of having that
machine be an electronic digital computer, came not just from theory -- both
Shannon and Wiener, for example, were also interested in other types of
machines67 -- but from the evolution of practical design projects in social and
cultural context.
67 Shannon, for example, presented an analog maze-solving machine to the eighth Macy
Conference. Heinz von Foerster, ed., Transactions of the Eighth Conference on Cybernetics (New
York: Josiah Macy Jr. Foundation, 1952), 173–180. Also see Shannon, “Computers and Automata,”
Proceedings of the IRE, Vol. 41 (1953), 1234–1241. Norbert Wiener, Cybernetics (Cambridge, MA:
MIT Press, 1948), describes a wide variety of devices; after the war Wiener became deeply
interested in prosthetics.
68 See Jan Rajchman, “Early Research on Computers at RCA,” in Metropolis, Howlett, and Rota, A
History of Computing in the Twentieth Century, 465–469.
69 David Mindell has recently argued that Bush’s Differential Analyzer was not primarily a
calculating engine, but a real-time control system. See David Mindell, “From Machinery to
Information: Control Systems Research at MIT in the 1930s,” paper presented at Society for the
History of Technology Annual Meeting, Lowell, MA, 1994, 18.
70 The future role of digital equipment in communication itself was also far from clear, since almost
all electronic communication technologies after the (digital) telegraph employed (analog)
represented by ENIAC shrank into insignificance when compared with the
wartime program in radar and control systems research, which were primarily
analog technologies, with the result that far more engineers understood analog
techniques than grasped the new ideas in digital computing.
These machines and the social groups centered around them (such as
industrial research laboratories, university engineering schools, and equipment
manufacturers) constituted a major source of resistance to the emerging digital
waveforms, converting sound waves into radio or electrical waves and back again. However, digital
switches -- relays -- were the primary control elements of the telephone network, and a source of
some of the early work on digital computers. As we will see in chapter 8, Claude Shannon’s wartime
work on encryption of voice transmissions produced the first approaches to digitizing sound waves.
71 C. A. Warren, B. McMillan, and B. D. Holbrook, “Military Systems Engineering and Research,”
in Fagen, A History of Engineering and Science in the Bell System, 618.
72 40,000 of the analog 40mm gun directors designed by the Servomechanisms Lab were
manufactured during the war. Its budget, by the war’s end, was over $1 million a year. Wildes and
Lindgren, A Century of Electrical Engineering and Computer Science at MIT, 211.
73 Mina Rees, “The Federal Computing Machine Program,” Science, Vol. 112 (1950), 732. Reprinted
in Annals of the History of Computing, Vol. 7, No. 2 (1985), 156–163.
paradigm, especially when it came to using the new machines for purposes other
than mathematical calculation. In the words of one participant,
Even as late as 1950, among the groups then developing digital machines,
the heritage of World War II analog equipment proved difficult to overcome.
When a Rand team seeking a programmable digital machine toured the
country’s major computer projects, “what [they] found was discouraging.” Many
of the groups working on reliability and high-speed computing were exploring
“modifications of radar technology, which was largely analog in nature. . . . They
were doing all kinds of tweaky things to circuits to make things work. It was all
too whimsical.”75
74 George E. Valley Jr., “How the SAGE Development Began,” Annals of the History of Computing,
Vol. 7, No. 3 (1985), 218.
75 Fred J. Gruenberger, “The History of the JOHNNIAC,” Annals of the History of Computing, Vol.
1, No. 1 (1979), 50.
extremely expensive (by the standards of analog equipment), and they demanded
constant and costly maintenance. Finally, early electronic computers employed
exotic materials and techniques, such as mercury delay line memory and the
cantankerous electrostatic storage tube, which added their own problems to the
issues of cost and reliability.
Even once it became clear (in the late 1940s) that electronic digital
computers would work, could be made reasonably reliable, and could operate at
speeds far outstripping their mechanical and electro-mechanical counterparts,
another issue prevented them from being seriously considered for control
functions. As George Valley, one of the leaders of the SAGE project, pointed out
in a 1985 retrospective, “relatively few wanted to connect computers to the real
world, and these people seemed to believe that the sensory devices would all
yield data. In fact, only some sensors -- such as weighing machines, odometers,
altimeters, the angle-tracking part of automatic tracking radars -- had built-in
counters. Most sensory devices relied on human operators to interpret noisy and
complex signals.”76 The problem lay in designing sensory devices that produced
direct numerical inputs for the computer to calculate with. Analog control
technologies did not require such conversions, because they represented
numerical quantities directly through physical parameters.77
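The conversion problem can be made concrete with a toy quantization sketch (the voltage range and resolution below are invented): a continuously varying sensor voltage must be reduced to one of a finite set of discrete counts before a digital machine can calculate with it, whereas an analog controller could operate on the voltage directly.

```python
# Toy illustration (invented numbers) of the conversion problem: a digital
# machine needs discrete counts, so a continuously varying sensor voltage must
# be quantized, losing some information. An analog controller could feed the
# voltage straight into its circuits.

def quantize(voltage, full_scale=10.0, bits=8):
    """Map a 0..full_scale voltage onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    clamped = max(0.0, min(full_scale, voltage))
    return min(levels - 1, int(clamped / full_scale * levels))

for v in (0.004, 3.14159, 9.999):
    code = quantize(v)
    reconstructed = code * 10.0 / 256        # what the digital machine "sees"
    print(f"{v:7.4f} V -> code {code:3d} -> {reconstructed:7.4f} V")
```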
In 1949, according to Valley, “almost all the groups that were realistically
engaged in guiding missiles . . . thought exclusively in terms of analog
computers.”78 A notable exception was the Northrop Snark missile project,
which engaged Eckert and Mauchly to build the BINAC digital computer,
completed in 1949, for its guidance system. However, the BINAC did not work
well, and Northrop engineers afterward moved toward special-purpose digital
differential analyzers -- and away from stored-program general-purpose
computers -- for the project.79
As late as 1960 Albert Jackson, manager of data processing for the TRW
Corporation, could write with authority, in a textbook on analog computation,
Only in the 1980s did efficient digital parallel processing become possible,
motivated in part by precisely this issue of real-time control. Jackson continued:
82 Defense Advanced Research Projects Agency, Strategic Computing (Washington, DC: Defense
Advanced Research Projects Agency, 1983), 8.
83 Jonathan Jacky, “The Strategic Computing Program,” in David Bellin and Gary Chapman, eds.,
Computers in Battle (New York: Harcourt Brace, 1987), 184.
84 For entrées to the large literature on this topic, see Gary Chapman, “The New Generation of
High-Technology Weapons,” in Bellin and Chapman, Computers in Battle, 61–100; Chris Hables
Gray, Computers as Weapons and Metaphors: The U.S. Military 1940–1990 and Postmodern War
(unpublished Ph.D. thesis, University of California, Santa Cruz, 1991); and Morris Janowitz, The
Professional Soldier (Glencoe, IL: Free Press, 1960).
computerized solutions.” It was, he wrote, impossible to tell whether the actual
results of such simulated solutions would occur as desired, because
Also in the early 1960s occasional articles in the armed forces journal
Military Review began warning of “electronic despotism” and “demilitarized
soldiers” whose tasks would be automated to the point that the men would be
deskilled and become soft.86 Based on interviews with obviously disaffected
commanders, U.S. News & World Report reported in 1962 -- under the banner
headline “Will ‘Computers’ Run the Wars of the Future?” -- that “military men
no longer call the tunes, make strategy decisions and choose weapons. In the
Pentagon, military men say they are being forced to the sidelines by top civilians,
their advice either ignored or not given proper hearing. . . . In actual defense
operations, military commanders regard themselves as increasingly dependent
on computer systems.”87 While these reports certainly exaggerated the actual
role of computers in military planning and especially in military operations at
the time, their existence shows that the view of computers as a solution to
military problems faced internal opposition from the start. They also
demonstrate how deeply an ideology of computerized command and control had
penetrated into U.S. military culture.
The automation theory alone, then, explains neither the urgency, the
magnitude, nor the specific direction of the U.S. military effort in computing.
Rather than explain how contests over the nature and potential of computers
were resolved, a utilitarian view writes history backwards, using the results of
those contests to account for their origins.
85 Colonel Francis X. Kane, USAF, "Security Is Too Important To Be Left to Computers," Fortune,
Vol. 69, No. 4 (1964), 146–147.
86 Ferdinand Otto Miksche, “The Soldier and Technical Warfare,” Military Review, Vol. 42, No. 8
(1962), 71–78; Major Keith C. Nusbaum, U.S. Army, “Electronic Despotism: A Serious Problem of
Modern Command,” Military Review, Vol. 42, No. 4 (1962), 31–39.
87 “Will ‘Computers’ Run Wars of the Future?,” U.S. News & World Report (April 23, 1962), 44–48.
Nor does a utilitarian view explain the pervasive military fascination
with computers epitomized by General Westmoreland’s speech in the aftermath
of Vietnam. “I see,” he proclaimed, “an Army built into and around an
integrated area control system that exploits the advanced technology of
communications, sensors, fire direction, and the required automatic data
processing -- a system that is sensitive to the dynamics of the ever-changing
battlefield -- a system that materially assists the tactical commander in making
sound and timely decisions.”88 This is the language of vision and technological
utopia, not practical necessity. It represents a dream of victory that is bloodless for
the victor, of battle by remote control, of speed approaching the instantaneous,
and of certainty in decision-making and command. It is a vision of a closed
world, a chaotic and dangerous space rendered orderly and controllable by the
powers of rationality and technology.
Why build computers? In this chapter I have tried to show that not only
the answers, but also the question itself, are complex. The importance of computers to the
future of U.S. military power was by no means obvious at the outset. To
understand how it became so, we must look closely at the intricate chains of
technological advances, historical events, government policies, and emergent
metaphors comprising closed-world discourse. For though policy choices at the
largest levels determined research directions, in some cases quite specifically,
defining digital computation as relevant to national priorities was not itself a
policy issue. Instead it involved a complicated nexus of technological choices,
technological traditions, and cultural values. In fact, digital computer research
itself ended up changing national priorities, as we will see in the following
chapters.