SEE Social and History Passages


READING BOOK THREE

AMERICAN HISTORY
AMERICAN CULTURE

ITS TOPICS COVER FUNDAMENTAL THEMES

PRESENTED IN

HISTORY AND SOCIAL PASSAGES


TOPICS

A New Birth of Freedom the Day of Jubilee

Abolition

Amending the Constitution

American constitution

American freedom

American House of Representatives

American Jews

American presidency

American senate

Conquering Space

Courtship in Early America

Early Industrialization

Enslavement

Establishing the Confederacy

Facts About Slavery During the Civil War

Feminism Reborn

History of women’s suffrage in America

Hollywood Today

Integrating the Armed Forces

NASA Technologies Benefit Our Lives

Native Americans

Overview of America Becomes a World Power

Overview of the American Revolution

Overview of the Colonial Era


Overview of the Great Depression

Overview of the Post-War Era

Overview of the Pre-Civil War Era

Overview of the Vietnam War

Overview of World War I

Overview of World War II

Political parties / elections

President Truman Using Atomic Bombs against Japan, 1945

Radical Reform and Antislavery

September 11, 2001

Slave Labor

Slavery's Evolution

The Aftermath of Slavery

The American Renaissance

The Bill of Rights

The Constitution and Slavery

The Emancipation Proclamation

The Equal Rights Amendment

The Fugitive Slave Law

The Great Depression in Global Perspective

The Holocaust

The Kansas-Nebraska Act

The New Hollywood

The New Woman


The Revolution of 1800

The Rise of Antislavery Thought

The Space Race

The supreme court

The U.S. Constitution and the Organizations of Government

The Vietnam War

Traditional Family Values and Breakdown of the Family

Voting Rights

What are the origins of American Slavery

Women's Liberation

Women's Rights

Facts About Slavery During the Civil War That Are a Must-Read

The Civil War is known as the bloodiest war in the history of America, and no discussion of this subject is
complete without mention of slavery. The period of Reconstruction started soon after, and there was a
significant change in the lives of the slaves in the South. It would be wrong to say that after the end of the
Civil War the former slaves were treated as equals, but their condition did change from that of slave laborers
to free laborers.

The history of slavery in America dates back to the seventeenth century, when slaves
were brought to Virginia in 1619. The era of slavery in the US can be broadly divided
into three sections:

The Antebellum
Slavery during the Civil War
The Reconstruction

We will be focusing our attention on the lives of slaves during the Civil War - a war
many believe was fought for their emancipation. But before we get an insight into
this subject, it is important to know in brief the events that led to the Civil War.

Election Of Abraham Lincoln


Abraham Lincoln was elected President of the United States in 1860, and this stirred
anxiety and fear in the southern states, which believed that the new government would
pass laws that would damage their economy and the 'southern way of life.' This was
primarily because northerners had little at stake in the institution of slavery.

Why the South Got Worried


The northern economy chiefly depended on industries and factories. The South, on the
other hand, depended heavily on slaves for its labor. The plantations of indigo, tobacco,
rice, and cotton (after the invention of the cotton gin) required hard labor, and slaves
were made to work long hours so that profit was maximized.

Confusion About The Civil War


Many people believe that the Civil War was about the North's struggle to emancipate the
slaves and the South's fight to continue the slave trade. However, it should be
remembered that the North did not go to war to emancipate the slaves; indeed, before
becoming President, Abraham Lincoln had explicitly stated that his aim was not to
abolish slavery but to contain its spread.

However, the southern states (none of which had voted for Abraham Lincoln) believed
that his election was detrimental to their economy and, hence, that they had no option
other than secession.

First State To Secede


South Carolina was the first state to declare secession from the United States, in
December 1860. Other southern states with a considerable stake in slave labor soon
followed suit and seceded from the Union: Mississippi, Georgia, Alabama, Florida,
Louisiana, and Texas. These states came to be known as the Confederate States.

The First Trigger


The event that precipitated the Civil War was the Confederacy's attack on Fort Sumter (a
fort in the southern state of South Carolina). This prompted Abraham Lincoln to call for
75,000 volunteers to help the Union fight the Confederate States. As the skirmish grew
into a full-fledged war, four more southern states seceded and joined the Confederacy.

These were the states of Virginia, Arkansas, Tennessee, and North Carolina. These
eleven states fought against the twenty-three states of the Union, and thus began
America's bloodiest war, which left over 620,000 people dead.

Contraband of War
Slaves who ran away from their masters at the onset of the Civil War found sanctuary
behind Union lines, but under the Fugitive Slave Act, escaped slaves had to be returned
to their masters. However, in Union-controlled Virginia, General Benjamin Butler came
up with a strategy that earned him the title of "Beast Butler" among southerners.

General Butler declared escaped slaves "contraband of war", which rendered the
Fugitive Slave Act ineffective. This encouraged thousands of slaves to cross over to
the Union side with the assurance that they would not be returned to their masters.

Slaves in the Confederacy


In the South, the army took slaves with them to the frontline to do menial work such as
washing, cooking, and digging. The idea was that by using slaves for these chores, the
army would be able to field more white men as soldiers. However, this move backfired
badly: not only did the slaves escape at the first opportunity, but they also provided
intelligence to the Union army.

As the Union's manpower (6 million) easily outnumbered the Confederacy's (2 million),
there was a desperate need for the South to push more men into the war. Still hesitant
to arm slaves, the Confederacy turned to the laymen of the South who worked in
industries and factories.

These people neither owned slaves nor had any interest in the institution, but they
joined the war because they inherently believed that the North had to be defeated in
order to ensure that their southern way of life was not hindered.

Slaves Join the War


Because of the influx of working white men into the Confederate army, there was a
shortage of laborers in industries and factories. To counter this, the owners decided to
allow slaves to fill in for the whites. In the final days of the Civil War, the Confederates,
who had always thought of slaves as inferior, contemplated allowing them to join the
army and fight against the Union. However, before any progress could be made, the
Confederates lost the war.

Slaves Started To Revolt


Although life was much more difficult for slaves on the frontline, the condition of
bondsmen on the plantations was not good either. The war had caused a shortage in the
supply of food, and slaves were the first to bear the brunt of this deprivation.

One thing that had changed on the plantations was that the slaves had started to rebel,
albeit in a different way. The slaves slowed their pace of work, as there was a shortage
of supervisors in the fields. With a majority of white men fighting a fratricidal war at the
front, it was left to white women and the elderly to rein in the slaves. But their attempts
at disciplining the slaves proved futile, and many slaves refused to obey their orders.

Difficulty For Lincoln

The problem for Lincoln was that he had not foreseen the emergence of such a large
number of escaped slaves. The Union army, though sophisticated in its war effort,
lacked the means to deal with the burgeoning number of slaves. The slaves were
provided with food and shelter, but their lives were similar to, if not worse than, what
they had been in the Southern states. They had to perform the same kinds of tasks,
such as digging latrines, building fortifications, and doing laundry.

Although Abraham Lincoln was anti-slavery, to preserve the Union he had to think
about the repercussions abolition would have on the whole country. A majority of
Northerners were anti-slavery as well, but it would not be right to say that they
considered blacks as equals.

Soldiers serving in the Union army resented the fact that they had to stay away from
their families in such hostile conditions to fight for the rights of slaves. This bred
animosity toward the escaped slaves, and there were several reports of ill-treatment.

Lincoln had first thought that the freed slaves would be resettled somewhere outside
America, with the government funding this effort initially until the freed slaves
developed their own economy. However, not many northerners were in favor of this
proposal, and eventually the plan never materialized.

The Beginning of Emancipation

Although the Confederacy had not been recognized as an independent country by any
European power, there were simmering apprehensions in the North that if something
drastic was not done, the Confederacy might well gain international support.

Amid mounting pressure from abolitionists and from abroad, Abraham Lincoln slowly
began the emancipation program. This was a major breakthrough in the war: until then
the Union's war aim had been to preserve the Union, but now, for the first time since
the war had started, the emancipation of slaves took center stage.

On September 22, 1862, Abraham Lincoln issued what is known as the Preliminary
Emancipation Proclamation. It explicitly stated that slaves in the rebel states would
be declared free on January 1, 1863. The proclamation did nothing to deter the South,
and fighting continued as usual. As promised, Abraham Lincoln issued the
Emancipation Proclamation on January 1, 1863.

Emancipation Proclamation

An excerpt from the proclamation reads: on the first day of January, in the year of our Lord one
thousand eight hundred and sixty-three, all persons held as slaves within any State
or designated part of a State, the people whereof shall then be in rebellion against
the United States, shall be then, thenceforward, and forever free; and the Executive
Government of the United States, including the military and naval authority thereof,
will recognize and maintain the freedom of such persons, and will do no act or acts
to repress such persons, or any of them, in any efforts they may make for their
actual freedom.

(Source: archives.gov)

The important thing to note about this proclamation is that it said nothing about the
future of slaves in the loyal border states. Historians believe that although the
proclamation was more symbolic than effectual, it was a stepping stone toward the
abolition of slavery three years later.

After the Emancipation Proclamation


After the announcement of the Emancipation Proclamation, pressure mounted on the
Union government to allow slaves to fight against the South. However, this was met
with much skepticism, as not many people in the North thought that slaves could be
good soldiers. They also thought that by enlisting slaves in the army, the world would
get the notion that the Union army was not competent enough to fight the war on its
own.

However, the prospect of a seemingly unending war convinced Lincoln that it was time
the slaves were allowed to fight their oppressors. The Second Confiscation and Militia
Act (1862) was passed, allowing the President to employ slaves in the army to help
suppress the rebellion.

Many slaves thought that they would now be able to fight in the war that had started
over their emancipation, but to their dismay, they were confined to service units while
the whites did all the fighting. To make matters worse, the slave soldiers experienced
what is now termed 'institutional racism.' White soldiers were paid $13 per month, but
the slave soldiers were paid only $10. A part of the wage was also deducted for clothing
charges, which left the slave soldiers with around $7 per month.

The first significant step in the formation of an African-American unit took place in New
Orleans, where three National Guard regiments were formed, but the biggest moment of
triumph and vindication came on July 18, 1863, when the 54th Massachusetts, a unit
comprising mainly African-Americans, attacked Fort Wagner.

The Confederate soldiers inside the fort retaliated, and the 54th Massachusetts lost
as many as 600 soldiers. Although the troops did not succeed in taking Fort Wagner,
the sacrifice and valor of the soldiers proved that the slaves wanted freedom and would
lay down their lives to achieve it. By the time the war was over in 1865, about 180,000
black men had served in the Union army. Total casualties among these soldiers came to
40,000, of which 30,000 died of illness and epidemics.

Abolition of Slavery

The Civil War ended in 1865, with the Northern forces of the Union defeating the
southern states fighting under the Confederacy. Congress passed the 13th Amendment,
which abolished slavery in the United States, on January 31, 1865, and it was ratified by
the states on December 6, 1865.

The amendment stated,


Neither slavery nor involuntary servitude, except as a punishment for crime whereof
the party shall have been duly convicted, shall exist within the United States, or any
place subject to their jurisdiction.



Overview of the Colonial Era

The year 1492 marks a watershed in modern world history. Columbus's voyage of discovery inaugurated a
series of developments that would have vast consequences for both the Old World and the New. It transformed
the diets of both the eastern and western hemispheres, helped initiate the Atlantic slave trade, spread diseases
that had a devastating impact on Indian populations, and led to the establishment of European colonies across
the Western Hemisphere.

This section identifies the factors--including rapid population growth, commerce, new learning, and the rise of
competing nation-states--that encouraged Europeans to explore and colonize new lands. It explains why
Portugal and Spain were the first to become involved in overseas exploration and why England and France were
slow to challenge Spain’s supremacy in the Americas.

Detailed Summary:

European Expansion

During the mid- and late-15th century, Europe gained mastery over the world's ocean currents and wind
patterns and began to create a European-centered world economy. Europeans developed astronomical
instruments and trigonometrical tables to plot the location of the sun and stars; replaced oarsmen with sails;
and began to better understand wind patterns and ocean currents.

The pioneer in European expansion was tiny Portugal, which, after 1385, was a united kingdom, and, unlike
other European countries, was free from internal conflicts. Portugal focused its energies on Africa's western
coast. It was Spain that would stumble upon the New World.

Columbus underestimated the circumference of the earth by one-fourth and believed he could reach Japan by
sailing 2,400 miles west from the Canary Islands. Until his death in 1506 he insisted that he had reached Asia.
But he quickly recognized that the new lands could be a source of wealth from precious minerals and sugar
cane.

The Columbian Exchange

The 15th and 16th century voyages of discovery brought Europe, Africa, and the Americas into direct contact,
producing an exchange of foods, animals, and diseases that scholars call the “Columbian Exchange.”

The Indians taught Europeans about tobacco, corn, potatoes, and varieties of beans, peanuts, tomatoes, and
other crops unknown in Europe. In return, Europeans introduced the Indians to wheat, oats, barley, and rice,
as well as to grapes for wine and various melons. Europeans also brought with them domesticated animals
including horses, pigs, sheep, goats, and cattle.

Even the natural environment was transformed. Europeans cleared vast tracts of forested land and
inadvertently introduced Old World weeds. The introduction of cattle, goats, horses, sheep, and swine also
transformed the ecology as grazing animals ate up many native plants and disrupted indigenous systems of
agriculture. The horse, extinct in the New World for ten thousand years, encouraged many farming peoples to
become hunters and herders.

The exchange, however, was not evenly balanced. Old World diseases killed millions of Indians. The survivors were
drawn into European trading networks that disrupted earlier patterns of life.

European Colonization

There were three distinct forms of European colonization in the New World: empires of conquest, commerce,
and settlement. Spain regarded the Indians as a usable labor force, while France treated the Indians primarily
as trading partners. The English, in contrast, adopted a policy known as plantation settlement: the removal of
the indigenous population and its replacement with native English and Scots.

For more than a century, Spain and Portugal were the only European powers with New World colonies. After
1600, however, other European countries began to emulate their example. France’s New World Empire was
based largely on trade. By the end of the 16th century, a thousand French ships a year were engaged in the fur
trade along the St. Lawrence River and the interior, where the French constructed forts, missions, and trading
posts.

Relations between the French and Indians were less violent than in Spanish or English colonies. In part, this
reflected the small size of France’s New World population, totaling just 3,000 in 1663. Virtually all these settlers
were men--mostly traders or Jesuit priests--and many took Indian wives or concubines, helping to promote
relations of mutual dependency. Common trading interests also encouraged accommodation between the
French and the Indians. Missionary activities, too, proved somewhat less divisive in New France than in New
Mexico or New England, since France’s Jesuit priests did not require Indian converts to immediately abandon their
tribal ties or their traditional way of life.

English Colonization

During the 17th century, when England established its first permanent colonies in North America, a crucial
difference arose between the southern-most colonies, whose economy was devoted to production of staple
crops, and the more diverse economies of the northern colonies.

Initially, settlers in the Chesapeake colonies of Maryland and Virginia relied on white indentured servants as
their primary labor force, and at least some of the blacks who arrived in the region were able to acquire
property. But between 1640 and 1670, a sharp distinction emerged between short-term servitude for whites
and permanent slavery for blacks. In Virginia, Bacon's Rebellion accelerated the shift toward slavery. By the
end of the century slavery had become the basic labor force in the southern colonies.

In New England, the economy was organized around small family farms and urban communities engaged in
fishing, handicrafts, and Atlantic commerce, with most of the population living in small compact towns. In
Maryland and Virginia, the economy was structured around larger and much more isolated farms and
plantations raising tobacco. In the Carolinas, economic life was organized around larger but less isolated
plantations growing rice, indigo, coffee, cotton, and sugar.

Religious persecution was a particularly powerful force motivating English colonization. Some 30,000 English
Puritans immigrated to New England, while Maryland became a refuge for Roman Catholics, and Pennsylvania,
southern New Jersey, and Rhode Island, havens for Quakers. Refugees from religious persecution included
Baptists, Congregationalists, and Presbyterians, to say nothing of religious minorities from continental Europe,
including Huguenots and members of the Dutch and German Reformed churches.

By 1700, Britain's North American colonies differed from England itself in the population growth rate, the
proportion of white men who owned property and were able to vote, as well as in the population's ethnic and
religious diversity. The early and mid-18th century brought far-reaching changes to the colonies, including a
massive immigration, especially of the Scots-Irish; the forced importation of tens of thousands of enslaved
Africans; and increasing economic stratification in both the northern and southern colonies. A series of religious
revivals known as the Great Awakening helped to generate an American identity that cut across colony lines.

Between 1660 and 1760, England sought to centralize control over its New World Empire and began to impose
a series of imperial laws upon its American colonies. From time to time, when the imperial laws became too
restrictive, the colonists resisted these impositions, and Britain responded with a system of accommodation
known as "salutary neglect."

During the late 17th and early and mid-18th centuries, the colonists became embroiled in a series of contests
for power between Britain, France and Spain. By the 1760s--after Britain had decisively defeated the French--
the colonists were in a position to challenge their subordinate position within the British Empire.

What are the origins of American Slavery?

In 1690, one out of every nine families in Boston owned a slave. In New York City, in 1703, two out of every five
families owned a slave. From Newport, Rhode Island to Buenos Aires, black slaves could be found in virtually every
New World area colonized by Europeans.

Black slaves arrived in the New World at least as early as 1502. Over the next three centuries, slave traders brought
at least fifteen million Africans to the New World (another twenty percent or more Africans died during the march to
the West African coast and an additional twenty percent perished during the "middle passage" across the Atlantic
Ocean).

Why beginning in the sixteenth century did Spanish, Portuguese, French, Dutch, Danish, and English colonists all
bring African slaves to their New World colonies? Why did they do something that we find wholly repugnant morally?

Few questions have aroused more bitter debate or evoked more impassioned controversy than the origins of black
slavery. Was it, as some have argued, the product of deep seated racial prejudice? Certainly, there is a great deal of
evidence showing that many Europeans held deeply racist sentiments well before the establishment of the institution
of slavery. We know, for example, that Elizabethan Englishmen associated blackness with evil, death, and danger.
They portrayed the devil as having black skin and associated beauty with fairness of skin. Through their religion, too,
Englishmen denigrated Africans, claiming that Negroes were the descendants of Noah's son Ham, who, according to
the Old Testament, was cursed by having black offspring for daring to look upon his father drunk and naked while his
brothers averted their eyes. (In fact, Ham was not the Biblical ancestor of Africans).

Long before the English had much contact with Africans, racist stereotypes were already widespread. One English
writer claimed that Negroes were naturally "addicted unto Treason, Treacherie, Murther, Theft and Robberie."
Without a doubt, Englishmen considered Africans an alien and unassimilable people.

Or was black slavery the product of a haphazard and random process that took place gradually with little real sense
of the ultimate outcome? Proponents of this line of argument note that there was nothing inevitable about European
colonists relying upon a black slave labor force. Far from being the result of a conscious plan, the adoption of black
slavery, it is argued, was the result of innumerable local and pragmatic choices, reflecting such variables as the
mortality of the native Indian population, the availability of white servants, and the cost of African slaves. In every
English colony, for example, colonists initially relied on white indentured servants for the bulk of their labor needs--
not on black slaves. They finally settled on African slaves because of supply shortages and the threat of revolt among
white indentured servants.

Still others insist that slavery was the product not of racism but the outgrowth of European attitudes toward the poor.
European societies were based on the principle of inequality. Elizabethan Englishmen flogged the poor and forced
them to toil in work houses. Far from finding the idea of slavery repellant, Elizabethan Englishmen accepted the idea,
for example, adopting a statute in 1547 allowing persistent vagabonds to be enslaved and branded with the letter
"S."

Once slavery was introduced, slavery carried far-reaching consequences for the future. By assuming positions
formerly occupied by an underclass of unruly and despised white servants, black slaves helped to create a
remarkably "free" and affluent society of whites, committed to the principles of liberty and equality.

Enslavement

Many Americans, influenced by images in the television mini-series "Roots," mistakenly believe that most slaves
were captured by Europeans who landed on the African coast and captured or ambushed people. While Europeans
did engage in some slave raiding, the majority of people who were transported to the Americas were enslaved by
other Africans. It is important to understand that Europeans were incapable, on their own, of kidnapping 20 million
Africans. Indeed, the system became so institutionalized that Europeans had little contact with the actual process of
enslavement.

How were people actually enslaved? Most slaves in Africa were captured in wars or in surprise raids on villages.
Adults were bound and gagged and infants were sometimes thrown into sacks. One of the earliest first-hand
accounts of the African slave trade comes from a seaman named Gomes Eannes de Azurara, who witnessed a
Portuguese raid on an African village. He said that some captives "drowned themselves in the water; others thought
to escape by hiding under their huts; others shoved their children among the seaweed."

The overwhelming majority of slaves sold to Europeans had not been slaves in Africa. They were free people who
were captured in war or were victims of banditry or were enslaved as punishment for certain crimes. In Senegambia,
the Guinea Coast, and the Slave Coasts of West Africa, war was the most important source of slaves. In Angola,
kidnapping and condemnation for debt were very important. In most cases, rulers or merchants were not selling their
own subjects, but people they regarded as alien. We must remember that Africans did not think of themselves as
Africans, but as members of separate nations.

Apologists for the African slave trade long argued that European traders did not enslave anyone: they simply
purchased Africans who had already been enslaved and who otherwise would have been put to death. Thus,
apologists claimed, the slave trade actually saved lives. Such claims represent a gross distortion of the facts. Some
independent slave merchants did in fact stage raids on unprotected African villages and kidnap and enslave Africans.
Most professional slave traders, however, set up bases along the west African coast where they purchased slaves
from Africans in exchange for firearms and other goods. Before the end of the seventeenth century, England, France,
Denmark, Holland, and Portugal had all established slave trading posts on the west African coast.

Yet to simply say that Europeans purchased people who had already been enslaved seriously distorts historical
reality. While there had been a slave trade within Africa prior to the arrival of Europeans, the massive European
demand for slaves and the introduction of firearms radically transformed West and Central African society. A growing
number of Africans were enslaved for petty debts or minor criminal or religious offenses or following unprovoked
raids on unprotected villages. An increasing number of religious wars broke out with the goal of capturing slaves.
European weapons made it easier to capture slaves.

Some African societies -- like Benin in southern Nigeria -- refused to sell slaves. Others, like Dahomey, appear to
have specialized in enslavement. Still other societies, like Asante, in present-day Ghana, and the Yoruba in western
Nigeria, engaged in wars that produced as many as half of all eighteenth and early nineteenth century slaves.

In western and central Africa, many new commercial states, merchants, and traders, chronically short of capital,
thrived by enslaving some people and selling others abroad. Birth rates often exceeded agriculture's capacity to feed
the population. Drought, famine, or periods of violent conflict might lead a ruler or a merchant to sell slaves. In
addition, many rulers sold slaves in order to acquire the trade goods--textiles, alcohol, and other rare imports--that
were necessary to secure the loyalty of their subjects.

In the earliest years, slaves tended to come from coastal areas. Over time, the source moved further into the African
interior. Many Africans retained females and sold off males. About two-thirds of the slaves sent to the New World
were male.

After capture, the captives were bound together at the neck and marched barefoot hundreds of miles to the Atlantic
coast. African captives typically suffered death rates of 20 percent or more while being marched overland. Observers
reported seeing hundreds of skeletons along the slave caravan routes. At the coast, the captives were held in pens
(known as barracoons) guarded by dogs. Our best guess is that another 15 to 30 percent of Africans died during
capture, the march from the interior, or the wait for slave ships along the coast.

The captives who survived the forced march to the sea were then examined by European slave traders: "The
Countenance and Stature, a good Set of Teeth, Pliancy in the Limbs and Joints, and being free of Venereal Taint, are
the things inspected and governs our choice in buying," wrote one slave trader. Those who were bought were
branded with hot irons, assigned numbers, and forced aboard ships; the others were simply abandoned.

Slave Labor

It is a mistake to think that slave labor was simply unskilled brutish work. Cultivation of cotton, tobacco, rice, and
sugar requires careful, painstaking effort. On larger plantations, masters relied on slave carpenters, bricklayers,
blacksmiths, wheelwrights, tanners, tailors, butchers, masons, coopers, cabinet makers, metal workers, and
silversmiths. Large numbers also worked as boatmen, waiters, cooks, drivers, housemaids, spinners, and weavers.
At Monticello, Thomas Jefferson worked slave women in a small textile mill and a nail-making factory.

For the vast majority of slaves, however, slavery meant back-breaking field work, planting, cultivating, and
harvesting cotton, hemp, rice, tobacco, or sugar cane. On a typical plantation, slaves worked ten or more hours a
day, "from day clean to first dark," six days a week, with only the Sabbath off. At planting or harvesting time,
planters required slaves to stay in the fields 15 or 16 hours a day. When they were not raising a cash crop, slaves
grew other crops, such as corn or potatoes; cared for livestock; and cleared fields, cut wood, repaired buildings and
fences.

On cotton, sugar, and tobacco plantations, slaves worked together in gangs under the supervision of an overseer or
a driver. On many rice and hemp plantations and the coastal areas of South Carolina and Georgia, slaveowners
adopted the task system. Slaves had specific daily work assignments, and after they completed their tasks, their
time was their own.

There is a tendency to think of slavery as an economically backward and inefficient institution. But far from being an
economic anachronism, recent research has shown that slavery was highly productive precisely because slaves could
be forced to work far longer hours than free workers. During the peak cotton picking or sugar harvesting seasons, 18
hour days were common.

Sugar plantations were the most innovative economic units of their time in terms of labor management and
organization. They were the first true factories in the world and anticipated the assembly line and the factory system
in their reliance on such features as close supervision and division of tasks.

Older or physically handicapped slaves were put to work in cloth houses, spinning cotton, weaving cloth, and making
clothes. Altogether, masters forced two-thirds of all slaves to work--twice the labor force participation rate among
the free population--reflecting the high proportion of children, women, and the elderly toiling in cotton or rice fields.

Because slaves had no direct incentive to work hard, slaveowners combined harsh penalties with positive incentives.
Some masters denied passes to disobedient slaves. Others confined recalcitrant slaves to private jails. Chains and
shackles were widely used to control runaways. Whipping was a key part of plantation discipline. On one Louisiana
plantation, a slave was lashed every four-and-a-half days.

But physical pain was not enough to elicit hard work. To stimulate productivity, some masters gave slaves small
garden plots and permitted them to sell their produce. Others distributed gifts of food or money at the end of the
year. Still other planters awarded prizes, holidays, and year-end bonuses to particularly productive slaves.

Not all slaves were field hands. During the 1850s, half a million slaves lived in southern towns and cities, where they
worked in textile mills, iron works, tobacco factories, laundries, and shipyards. Other slaves labored as lumberjacks,
as deckhands on riverboats, and in sawmills, gristmills, and quarries. Many slaves were engaged in construction of
roads and railroads. Even on plantations not all slaves were menial laborers. About 250,000 were craftsmen such as
blacksmiths, shoemakers, or carpenters or held domestic posts such as coachmen or house servants.

Slavery's Evolution

At the time Thomas Jefferson was born, in 1743, most slaves had been born in Africa, few were Christian, and very
few slaves were engaged in raising cotton. Slavery was largely confined to eastern areas near the Atlantic Ocean,
and in the Carolinas and Georgia the slave population was not yet able to reproduce its numbers naturally.

By the time of Jefferson's death in 1826, slavery had changed dramatically. A series of momentous revolutions had
transformed the institution. First, as a result of a demographic revolution, a majority of slaves had been born in the
New World and they were capable of increasing the slave population by natural reproduction. During the seventeenth
century, slaves had had few opportunities to establish stable family relationships. In the Chesapeake colonies and the
Carolinas, two-thirds of all slaves were male, and most slaves lived on plantations with fewer than ten slaves. These
units were so small and widely dispersed, and the sex ratio so skewed, that it was difficult for slaves to find spouses.
A high death rate meant that many slaves did not live long enough to marry or, if they did, their marriages did not
last very long. But by the 1720s in the Chesapeake and the 1750s and '60s in the South Carolina and Georgia Low
Country, the slave population was naturally reproducing its numbers.

Second, the so-called "plantation revolution" not only increased the size of plantations, but made them more
productive and efficient economic units. Planters expanded their operations and imposed more supervision on their
slaves.

A third revolution was religious. During the colonial period, many planters resisted the idea of converting slaves to
Christianity out of a fear that baptism would change a slave's legal status. The black population was virtually
untouched by Christianity until the religious revivals of the 1730s and 1740s. By the early nineteenth century,
slaveholders increasingly adopted the view that Christianity would make slaves more submissive, orderly, and
conscientious and encouraged missionary activities among slaves.

Slaves themselves found in Christianity a faith that could give them hope in an oppressive world. While Christianity
has sometimes been called a "religion of passivity," it proved impossible to purge Christianity of its antislavery
overtones with its promise of deliverance from bondage. Beginning in the 1740s and greatly accelerating in the early
1800s, many slaves converted to Christianity. In general, slaves did not join their masters' churches. Most became
Baptists or Methodists.

A fourth revolution altered the areas in which slaves lived and worked. Between 1790 and 1860, 835,000 slaves were
moved from Maryland, Virginia, and the Carolinas to Alabama, Mississippi, Louisiana, and Texas. We know that slaves
were frequently sold apart from their families or separated from family members when they were moved to the Old
Southwest.

Finally, there was a revolution in moral values and sensibility. For the first time in history, religious and secular
groups denounced slavery as sinful and as a violation of natural rights. During the 1760s, the first movements in
history began to denounce slavery. The earliest groups to oppose slavery were "perfectionist" religious sects like the
Quakers who challenged all traditional authorities and wanted to live free from sin.

The Aftermath of Slavery

Except for Haiti, the American South was the only region in the western hemisphere in which slavery was
overthrown by force of arms. It was the only region, except for Brazil, in which slaveowners received no
compensation for the loss of their slave property; and the only region, except for Guadeloupe and Martinique, in
which former slaves received civil and political rights. And it was the only postemancipation society in which large
slaveowners were deprived of the right to hold public office and in which former slaves formed successful political
alliances with whites.

Nevertheless, the abolition of slavery did not mean that former slaves had achieved full freedom. Throughout the
western hemisphere, the end of slavery was followed by a period of reconstruction in which race relations were
redefined and new systems of labor emerged. In former slave societies throughout the Americas, ex-slaves sought to
free themselves from the gang system of labor on plantations and establish small-scale, self-sufficient farms, while
planters or local governments sought to restore the plantation system. The outcome, in many former slave societies,
was the emergence of a caste system of race relations and a system of involuntary or forced labor, such as peonage,
debt bondage, apprenticeship, contract labor, indentured labor, tenant farming, and sharecropping.

In every postemancipation society, the abolition of slavery resulted in acute labor problems and declining
productivity, spurring efforts to restore plantation discipline. Even in Haiti, where black revolution had overthrown
slavery, repeated attempts were made to restore the plantation system. On Caribbean islands--like Antigua,
Barbados, and St. Kitts--where land was totally controlled by white planters, the plantation system was reimposed. In
other areas--like British Guiana, Jamaica, or Trinidad--where former slaves were able to squat on unsettled land and
set up subsistence farms, staple production fell sharply. To counteract the precipitous decline in sugar production, the
British government imported tens of thousands of "coolie" laborers from China, India, Java, and West Africa into
Guiana, Jamaica, Mauritius, Surinam, and Trinidad. In many plantation societies, governments sought to force former
slaves back to work on plantations with strict vagrancy laws, coercive labor contracts, and regressive taxes.

The story of Reconstruction in the American South echoes that broader concern with labor control. Immediately
following the war, all-white state legislatures passed "black codes" designed to force freed blacks to work on
plantations, where they would be put to work in gangs. These codes denied blacks the right to purchase or even rent
land. Vagrancy laws allowed authorities to arrest blacks "in idleness" and assign them to a chain gang or auction
them off to a planter for as long as a year. Other statutes required blacks to have written proof of employment and
barred blacks from leaving plantations. The Freedmen's Bureau, ostensibly designed to aid former slaves, helped to
enforce laws against vagrancy and loitering and refused to allow ex-slaves to keep land that they had occupied
during the war. One black army veteran asked rhetorically: "If you call this Freedom, what did you call Slavery?"

Such efforts to virtually reenslave the freedmen led Congressional Republicans to seize control of Reconstruction
from President Andrew Johnson, deny representatives from the former Confederate states their Congressional seats,
and pass the Civil Rights Act of 1866 and write the Fourteenth Amendment to the Constitution to extend citizenship
rights to African Americans and guarantee them equal protection of the laws. In 1870, the country went even further
by ratifying the Fifteenth Amendment, which extended voting rights to black men. But the most radical proposal
advanced during Reconstruction--to confiscate plantations and redistribute portions of the land to the freedmen--was
defeated.

The freedmen, in alliance with carpetbaggers (Northerners who had migrated to the South following the Civil War)
and southern white Republicans known as scalawags, temporarily gained power in every former Confederate state
except Virginia. Altogether, over 600 blacks served as legislators in Reconstruction governments (though blacks
comprised a majority only in the lower house of South Carolina's legislature). The Reconstruction governments drew
up democratic state constitutions, expanded women's rights, provided debt relief, and established the South's first
state-funded schools. During the 1870s, however, internal divisions within the southern Republican party, white
terror, and northern apathy allowed white Democrats, known as Redeemers, to return to power in the South's state
governments. The North's failure to enforce the 14th and 15th Amendments permitted racial segregation and
disfranchisement in the South.

During Reconstruction, former slaves--and many small white farmers--became trapped in a new system of economic
exploitation--sharecropping. Lacking capital and land of their own, former slaves were forced to work for large
landowners. Initially, planters, with the support of the Freedmen's Bureau, sought to restore gang labor under the
supervision of white overseers. The freedmen, who wanted autonomy and independence, refused to sign contracts
that required gang labor. Ultimately, sharecropping emerged as a sort of compromise.

Instead of cultivating land in gangs supervised by overseers, landowners divided plantations into 20- to 50-acre plots
suitable for farming by a single family. In exchange for land, a cabin, and supplies, sharecroppers agreed to raise a
cash crop (usually cotton) and to give half the crop to their landlord. The high interest rates landlords and merchants
charged for goods bought on credit transformed sharecropping into a system of economic dependency and poverty.
The freedmen found that "freedom could make folks proud but it didn't make 'em rich."

Nevertheless, the sharecropping system did allow freedmen a degree of freedom and autonomy greater than that
experienced under slavery. As a symbol of their newly won independence, freedmen had teams of mules drag their
former slave cabins away from the slave quarters into their own fields. Black wives and daughters sharply reduced
their labor in the fields and instead devoted more time to childcare and housework. For the first time, black families
could divide their time between fieldwork and housework in accordance with their own family priorities.

Chattel slavery had been defeated. The gang system of labor, enforced by the whip, was dead. Real gains had been
won. But full freedom remained an unfulfilled promise.

In 1970, the countries of the Arabian peninsula became the last in the world to abolish legal slavery. Nevertheless,
the buying and selling of human beings continues to flourish in many parts of the world. Each year, an estimated one
million Asian women and children, in Bangladesh, Burma, India, Pakistan, Sri Lanka, Thailand, and elsewhere, are
sold or auctioned into slavery to serve as prostitutes or child laborers. Methods of procurement have changed since
the 18th century; instead of being kidnapped, slaves are bought in impoverished villages for a few hundred dollars.
But the cruelties of slavery remain, with contemporary slaves chained to beds in brothels or at workbenches in
sweatshops. The final chapter in the history of slavery remains unfinished.

Overview of the Pre-Civil War Era

During the early 19th century, and especially after the War of 1812, American society was profoundly
transformed. These years witnessed rapid economic and territorial expansion; the extension of democratic
politics; the spread of evangelical revivalism; the rise of the nation's first labor and reform movements; the
growth of cities and industrial ways of life; radical shifts in the roles and status of women; and deepening
sectional conflicts that would bring the country to the verge of civil war.

This section examines the changes that took place in voting, nominating procedures, party organization, and
campaign strategies between 1820 and 1840; and explains why new political parties emerged in the United
States between the 1820s and the 1850s and how these parties differed in their principles and their bases of
support.

You will learn about the religious, cultural, and social factors that gave rise to efforts to suppress the drinking
of hard liquor; to rehabilitate criminals; establish public schools; care for the mentally ill, the deaf, and the
blind; abolish slavery; and extend women's rights, as well as about the efforts of authors and artists to create
distinctly American forms of literature and art.

In addition, you will read about the Native Americans and Mexicans who lived in the trans-Mississippi West;
about the exploration of the Far West and the forces that drove traders, missionaries, and pioneers westward;
and the way that the United States acquired Texas, the Great Southwest, and the Pacific Northwest by
annexation, negotiation, and war.

Finally, you will read about the diverging economic developments that contributed to growing sectional
differences between the North and South, and about the Compromise of 1850, including the Fugitive Slave
Law; the demise of the Whig Party and the emergence of the Republican Party; the Kansas-Nebraska Act;
violence in Kansas; the controversial Supreme Court decision in the case of Dred Scott; and John Brown's raid
on Harpers Ferry.

Summary:

Throughout the Western world, the end of the Napoleonic Wars brought an end to a period of global war and
revolution and the start of a new era of rapid economic growth. For Americans, the end of the War of 1812
unleashed the rapid growth of cities and industry and a torrent of expansion westward. The years following the
war also marked a notable advance of democracy in American politics. Property qualifications for voting and
office holding were abolished; voters began to directly elect presidential electors, state judges, and governors;
and voting participation skyrocketed. In addition, the antebellum era saw a great surge in collective efforts to
improve society through reform. Unprecedented campaigns sought to outlaw alcohol, guarantee women's
rights, and abolish slavery.

Rapid territorial expansion also marked the antebellum period. Between 1845 and 1853, the nation expanded
its boundaries to include Arizona, California, Colorado, Idaho, Nevada, New Mexico, Oregon, Texas, Utah,
Washington, and Wyoming. The United States annexed Texas in 1845; partitioned the Oregon country in 1846
following negotiations with Britain; wrested California and the great Southwest from Mexico in 1848 after the
Mexican War; and acquired the Gadsden Purchase in southern Arizona from Mexico in 1853.

The period's most fateful development was a deepening sectional conflict that brought the country to the brink
of civil war. The addition of new land from Mexico raised the question that would dominate American politics
during the 1850s: whether slavery would be permitted in the western territories. The Compromise of 1850
attempted to settle this issue by admitting California as a free state but allowing slavery in the rest of the
Mexican cession. But enactment of the Fugitive Slave Law as part of the compromise exacerbated sectional
tensions. The question of slavery in the territories was revived by the 1854 decision to open Kansas and
Nebraska territories to white settlement and decide the status of slavery according to the principle of popular
sovereignty. Sectional conflict was intensified by the Supreme Court's Dred Scott decision, which declared that
Congress could not exclude slavery from the western territories; by John Brown's raid on Harpers Ferry; and by
Abraham Lincoln's election as president in 1860.

Jacksonian Democracy

The period from 1820 to 1840 was a time of important political developments. Property qualifications for voting
and officeholding were repealed; voting by voice was eliminated. Direct methods of selecting presidential
electors, county officials, state judges, and governors replaced indirect methods. Voter participation increased.
A new two-party system replaced the politics of deference to elites. The dominant political figure of this era was
Andrew Jackson, who opened millions of acres of Indian lands to white settlement, destroyed the Second Bank
of the United States, and denied the right of a state to nullify the federal tariff.

The Roots of American Economic Growth

After the War of 1812, the American economy grew at an astounding rate. The development of the steamboat
by Robert Fulton revolutionized water travel, as did the building of canals. The construction of the Erie Canal
stimulated an economic revolution that bound the grain basket of the West to the eastern and southern
markets. It also unleashed a spurt of canal building. Eastern cities experimented with railroads which quickly
became the chief method of moving freight. The emerging transportation revolution greatly reduced the cost of
bringing goods to market, stimulating both agriculture and industry. The telegraph also stimulated development
by improving communication. Eli Whitney pioneered the method of production using interchangeable parts that
became the foundation of the American System of manufacture. Transportation improvements combined with
market demands stimulated cash crop cultivation. Agricultural production was also transformed by the iron plow
and later the mechanical thresher. Economic development contributed to the rapid growth of cities. Between
1820 and 1840, the urban population of the nation increased by 60 percent each decade.

Religion in the Early Republic

Two currents in religious thought--religious liberalism and evangelical revivalism--had enormous impact on the
early republic. Religious liberalism was an emerging form of humanitarianism that rejected the harsh Calvinist
doctrines of original sin and predestination. Its preachers stressed the basic goodness of human nature and
each individual's capacity to follow the example of Christ. At the same time, enthusiastic religious revivals
swept the nation in the early 19th century. The revivals inspired a widespread sense that the nation was
standing close to the millennium, a thousand years of peace and brotherhood when sin, war, and tyranny would
vanish from the earth. In addition, the growth of other religions--African American Christianity, Catholicism,
Judaism, the Mormon Church--reshaped America's religious landscape.

Pre-Civil War Reform

During the first half of the 19th century, reformers launched unprecedented campaigns to reduce drinking,
establish prisons, create public schools, educate the deaf and the blind, abolish slavery, and extend equal rights
to women. Increasing poverty, lawlessness, violence, and vice encouraged efforts to reform American society.
So, too, did the ideals enshrined in the Declaration of Independence, the philosophy of the Enlightenment, and
liberal and evangelical religion. Reform evolved through three phases. The first phase sought to persuade
Americans to lead more Godly daily lives. Moral reformers battled profanity and Sabbath breaking, attacked
prostitution, distributed religious tracts, and attempted to curb the use of hard liquor. Social reformers sought
to solve the problems of crime and illiteracy by creating prisons, public schools, and asylums for the deaf, the
blind, and the mentally ill. Radical reformers sought to abolish slavery and eliminate racial and gender
discrimination and create ideal communities as models for a better world.

Pre-Civil War American Culture

At the end of the 18th century, the United States had few professional writers or artists and lacked a class of
patrons to subsidize the arts. But during the decades before the Civil War, distinctively American art and
literature emerged. In the 1850s, novels appeared by African-American and Native American writers. Mexican-
Americans and Irish immigrants also contributed works on their experiences. Beginning with historical paintings
of the American Revolution, artists attracted a large audience. Landscape painting also proved popular. An
indigenous popular culture also emerged between 1800 and 1860, consisting of penny newspapers, dime
novels, and minstrel shows.

Westward Expansion

Until 1821, Spain ruled the area that now includes Arizona, California, Colorado, Nevada, New Mexico, Texas,
and Utah. The Mexican war for independence opened the region to American economic penetration.
Government explorers, traders, and trappers helped to open the West to white settlement. In the 1820s,
thousands of Americans moved into Texas, and during the 1840s, thousands of pioneers headed westward
toward Oregon and California, seeking land and inspired by manifest destiny, the idea that America had a
special destiny to stretch across the continent. Between 1844 and 1848 the United States expanded its
boundaries into Texas, the Southwest, and the Pacific Northwest. It acquired Texas by annexation; Oregon and
Washington by negotiation with Britain; and Arizona, California, Colorado, Idaho, Nevada, New Mexico, Oregon,
Utah, and Wyoming as a result of war with Mexico.

The Pre-Civil War South

In the decades before the Civil War, northern and southern development followed increasingly different paths.
By 1860, the North contained 50 percent more people than the South. It was more urbanized and attracted
many more European immigrants. The northern economy was more diversified into agricultural, commercial,
manufacturing, financial, and transportation sectors. In contrast, the South had smaller and fewer cities and a
third of its population lived in slavery. In the South, slavery impeded the development of industry and cities
and discouraged technological innovation. Nevertheless, the South was wealthy and its economy was rapidly
growing. The southern economy largely financed the Industrial Revolution in the United States, and stimulated
the development of industries in the North to service southern agriculture.

The Impending Crisis

For forty years, attempts were made to resolve conflicts between North and South. The Missouri Compromise
prohibited slavery in the northern half of the Louisiana Purchase. The acquisition of vast new territories during
the 1840s reignited the question of slavery in the western territories. The Compromise of 1850 was an attempt
to solve this problem by admitting California as a free state but allowing slavery in the rest of the Southwest.
But the compromise included a fugitive slave law opposed by many Northerners. The Kansas-Nebraska Act
proposed to settle the status of slavery in those territories by popular sovereignty. But this led to violent conflict in Kansas
and the rise of the Republican Party. The Dred Scott decision eliminated possible compromise solutions to the
sectional conflict and John Brown's raid on Harpers Ferry convinced many Southerners that a majority of
Northerners wanted to free the slaves and incite race war.

Overview of the Post-War Era

In 1945, the United States was a far different country than it subsequently became. Nearly a third of
Americans lived in poverty. A third of the country's homes had no running water, two-fifths lacked flushing
toilets, and three-fifths lacked central heating. More than half of the nation's farm dwellings had no electricity.
Most African Americans still lived in the South, where racial segregation in schools and public accommodations
was still the law. The number of immigrants was small as a result of immigration quotas enacted during the
1920s. Shopping malls had not yet been introduced.

Following World War II, the United States began an economic boom that brought unparalleled prosperity to a
majority of its citizens and raised Americans' expectations, breeding a belief that most economic and social
problems could be solved. Among the crucial themes of this period were the struggle for equality among
women and minorities, and the backlash that these struggles evoked; the growth of the suburbs, and the shift
in power from the older industrial states and cities of the Northeast and upper Midwest to the South and West;
and the belief that the U.S. had the economic and military power to maintain world peace and shape the
behavior of other nations.

The Cold War

After World War II, the United States clashed with the Soviet Union over such issues as the Soviet dominance
over Eastern Europe, control of atomic weapons, and the Soviet blockade of Berlin. The establishment of a
Communist government in China in 1949 and the North Korean invasion of South Korea in 1950 helped
transform the Cold War into a global conflict. The United States would confront Communism in Iran,
Guatemala, Lebanon, and elsewhere. In an atmosphere charged with paranoia and anxiety, there was deep
fear at home about “enemies within” sabotaging U.S. foreign policy and passing atomic secrets to the Soviets.

Postwar America

During the early 1970s, films like American Graffiti and television shows like “Happy Days” portrayed the
1950s as a carefree era--a decade of tail-finned Cadillacs, collegians stuffing themselves in phone booths, and
innocent tranquility and static charm. In truth, the post-World War II period was an era of intense anxiety and
dynamic, creative change. During the 1950s, African Americans quickened the pace of the struggle for equality
by challenging segregation in court. A new youth culture emerged with its own form of music--rock ‘n' roll.
Maverick sociologists, social critics, poets, and writers--conservatives as well as liberals--authored influential
critiques of American society.

Overview of the American Revolution

Much more than a revolt against British taxes and trade regulations, the American Revolution was the first modern
revolution. It marked the first time in history that a people fought for their independence in the name of certain
universal principles such as rule of law, constitutional rights, and popular sovereignty.

This section examines the causes, fighting, and consequences of the American Revolution. You will read about the
problems created by the Seven Years' War, and British efforts to suppress American smuggling, to prevent warfare
with Indians, and to pay the cost of stationing troops in the colonies. You will also read about the emerging patterns
of resistance in the colonies, including petitions, pamphlets, intimidation, boycotts, and intercolonial meetings. You
will also learn about the series of events, including the Boston Massacre, the Boston Tea Party, and the Coercive Acts,
that ruptured relations between Britain and its American colonies.

In addition, you will learn why many colonists hesitated before declaring independence and how the Declaration of
Independence summarized colonial grievances and provided a vision of a future independent American republic. This
chapter will discuss the composition of the British and American military forces; the Revolution's implications for the
institution of slavery; and the role of the French, Spanish, Dutch, and Native Americans in the colonists' struggle for
independence. Finally, you will learn why the Americans emerged victorious in the Revolution.

Summary:

The Causes of the Revolution

The roots of the American Revolution can be traced to the year 1763 when British leaders began to tighten imperial
reins. Once-harmonious relations between Britain and the colonies became increasingly conflict-riven. Britain’s land policy prohibiting settlement in the West irritated colonists, as did the arrival of British troops. The most serious
problem was the need for money to support the empire.

Attempts through the Sugar Act, the Stamp Act, and the Townshend Acts to raise money rather than control trade
met with growing resistance in the colonies. Tensions increased further after Parliament passed the Coercive Acts and
the First Continental Congress took the first steps toward independence from Britain. Before the colonies gained
independence, they had to fight a long and bitter war.

The Revolutionary War

The British had many advantages in the war, including a large, well-trained army and navy and many Loyalists who
supported the British Empire. But many white colonists were alienated by Lord Dunmore’s promise of freedom to
slaves who joined the royal army, and were inspired by Thomas Paine’s Common Sense.

Excellent leadership by George Washington; the aid of such European nations as France; and tactical errors by British
commanders contributed to the American victory. British strategy called for crushing the rebellion in the North.
Several times the British nearly defeated the Continental Army. But victories at Trenton and Princeton, N.J., in late
1776 and early 1777 restored patriot hopes, and victory at Saratoga, N.Y., which halted a British advance from
Canada, led France to intervene on behalf of the rebels.

In 1778, fighting shifted to the South. Britain succeeded in capturing Georgia and Charleston, S.C. and defeating an
American army at Camden, S.C. But bands of patriots harassed loyalists and disrupted supply lines, and Britain failed
to achieve control over the southern countryside before advancing northward to Yorktown, Va. In 1781, an American
and French force defeated the British at Yorktown in the war's last major battle.

Consequences:

1. About 7,200 Americans died in battle during the Revolution. Another 10,000 died from disease or exposure and
about 8,500 died in British prisons.
2. A quarter of the slaves in South Carolina and Georgia escaped from bondage during the Revolution. The Northern states outlawed slavery or adopted gradual emancipation plans.

3. The states adopted written constitutions that guaranteed religious freedom, increased the legislature's size and powers, made taxation more progressive, and reformed inheritance laws.

The Kansas-Nebraska Act

In 1854, a piece of legislation was introduced in Congress that shattered all illusions of sectional peace. The Kansas-
Nebraska Act destroyed the Whig Party, divided the Democratic Party, and created the Republican Party. Ironically,
the author of this legislation was Senator Stephen A. Douglas, who had pushed the Compromise of 1850 through
Congress and who had sworn after its passage that he would never make a speech on the slavery question again.

As chairman of the Senate Committee on Territories, Douglas proposed that the area west of Iowa and Missouri--
which had been set aside as a permanent Indian reservation--be opened to white settlement. Southern members of
Congress demanded that Douglas add a clause specifically repealing the Missouri Compromise, which would have
barred slavery from the region. Instead, the status of slavery in the region would be decided by a vote of the region's
settlers. In its final form, Douglas's bill created two territories, Kansas and Nebraska, and declared that the Missouri
Compromise was "inoperative and void." With solid support from Southern Whigs and Southern Democrats and the
votes of half of the Northern Democratic members of Congress, the measure passed.

Why did Douglas risk reviving the slavery question? His critics charged that the Illinois Senator's chief interest was to
win the Democratic presidential nomination in 1860 and secure a right of way for a transcontinental railroad that
would make Chicago the country's transportation hub.

Douglas's supporters pictured him as a proponent of western development and a sincere believer in popular
sovereignty as a solution to the problem of slavery in the western territories. Douglas had long insisted that the
democratic solution to the slavery issue was to allow the people who actually settled a territory to decide whether
slavery would be permitted or forbidden. Popular sovereignty, he believed, would allow the nation to "avoid the
slavery agitation for all time to come."

In fact, by 1854 the political and economic pressure to organize Kansas and Nebraska had become overwhelming.
Midwestern farmers agitated for new land. A southern transcontinental rail route had been secured through the Gadsden Purchase in December 1853, and promoters of a northern railroad route viewed territorial organization as essential.
was doomed if they were surrounded by a free territory.

The Rise of Antislavery Thoughts

The original opponents of slavery were deeply religious women and men who believed that slavery was sinful. Most
of the earliest critics of slavery were Quakers. The Society of Friends, as the group was formally known, was a
religious denomination that had arisen during England's civil war of the mid-1600s. They sought to live free of sin, condemned war, and refused to bear arms, take oaths, or bow or take off their hats to social superiors.
Rejecting an ordained ministry, the Quakers believed that the Holy Spirit was present in every human heart.

Compared to other religious sects of the time, the Quakers were extraordinarily egalitarian. Quaker women assumed
ministerial roles, and Quakers rejected the notion that infants were born sinful.

Widespread Quaker opposition to slavery arose during the Seven Years' War (1756-1763), when many Friends were
persecuted for refusing to fight or pay taxes. Many members of the group responded to persecution by asserting the
duty of individual Quakers to confront evil. As a result, a growing number of Quakers began to take active steps
against poverty, the drinking of hard liquor, unjust Indian policies, and, above all, slavery. During the 1750s, 1760s,
and 1770s, the Quakers became the first organization in history to prohibit slaveholding.

The very first antislavery petition in the New World was drafted in 1688 by Dutch-speaking Quakers who lived in
Germantown, Penn. Their ancestors had been tortured and persecuted for their religious beliefs, and they saw a
striking similarity between their ancestors' sufferings and the sufferings of slaves. They charged that Africans had
been seized illegally from their homelands, shipped across the Atlantic against their will, and sold away from their
families.

In 1688, the Germantown Quakers stood alone in their protests against slavery. They passed their petition on to
other Quakers in Pennsylvania, only to see their protest against slavery ignored.

Read the following excerpt from the Germantown Quaker Petition of 1688 and identify the reasons why they opposed
slavery:

"There is a saying, that we should do to all men like as we will be done ourselves; making no difference
of what generation, descent, or colour they are.... To bring men hither [to America], or to rob and sell
them against their will, we stand against. In Europe there are many oppressed for conscience-sake; and
here there are those oppressed which are of a black colour....Pray, what thing in the world can be done
worse towards us, than if men should rob or steal us away, and sell us for slaves to strange countries;
separating husbands from their wives and children."

Establishing the Confederacy

In early February 1861, the states of the lower South established a new government, the Confederate States of
America, in Montgomery, Alabama, and drafted a constitution. Although modeled on the U.S. Constitution, this
document specifically referred to slavery, state sovereignty, and God. It explicitly guaranteed slavery in the states
and territories, but prohibited the international slave trade. It also limited the President to a single six-year term,
gave the President a line-item veto, required a two-thirds vote of Congress to admit new states, and prohibited
protective tariffs and government funding of internal improvements.

As President, the Confederates selected former U.S. Senator and Secretary of War Jefferson Davis (1808-1889). The
Alabama secessionist William L. Yancey (1814-1863) introduced Davis as Confederate President by declaring: "The
man and the hour have met. Prosperity, honor, and victory await his administration."

At first glance, Davis seemed much more qualified to be President than Lincoln. Unlike the new Republican President,
who had no formal education, Davis was a West Point graduate. And while Lincoln had only two weeks of military
experience as a militia captain in the Black Hawk War, with no combat experience, Davis had served as a regimental
commander during the Mexican War. In office, however, Davis's rigid, humorless personality; his poor health; his
inability to delegate authority; and, above all, his failure to inspire confidence in his people would make him a far less
effective chief executive than Lincoln. During the war, a southern critic described Davis as "false and
hypocritical...miserable, stupid, one-eyed, dyspeptic, arrogant...cold, haughty, peevish, narrow-minded, pig-headed,
[and] malignant."

Following secession, the Confederate states attempted to seize federal property within their boundaries, including
forts, customs houses, and arsenals. Several forts, however, remained within Union hands, including Fort Pickens in
Pensacola, Florida, and Fort Sumter in Charleston, South Carolina's harbor.

Abolition

The growth of public opposition to slavery represents one of the most momentous moral transformations in history.
As late as 1750, no church condemned slave ownership or slave trading. Britain, Denmark, France, Holland,
Portugal, and Spain all openly participated in the slave trade. Beginning with the Quakers in the late 1750s, however,
organized opposition to slavery quickly grew. In 1787, the Northwest Ordinance barred slavery from the territories
north of the Ohio River; by 1804, the nine states north of Delaware had freed slaves or adopted gradual
emancipation plans. In Haiti in 1791, nearly a half million slaves emancipated themselves by insurrection and
revolutionary struggle. In 1807, Britain and the United States outlawed the African slave trade.

The wars of national liberation in Spanish America ended slavery in Spain's mainland New World empire. In 1821,
the region that now includes Ecuador, Colombia, and Venezuela adopted a gradual emancipation plan. Two years
later, Chile agreed to emancipate its slaves. In 1829, Mexico abolished slavery.

In 1833, Britain emancipated 780,000 slaves, paying 20 million pounds sterling compensation to their owners. In
1848, Denmark and France freed slaves in their colonial empires. Slavery survived in Surinam and other Dutch New
World colonies until 1863, and in the United States until 1865. The last New World slaves were emancipated in Cuba in
1886 and in Brazil in 1888.

Within the span of a century and a half, slavery, long regarded as an inevitable part of the social order, came to be
seen as a violation of Christian morality and the natural, inalienable rights of man. The main impetus behind
antislavery came from religion. New religious and humanitarian values contributed to a view of slavery as "the sum
of all villainies," a satanic institution which gave rise to every imaginable sin: violence, despotism, racial prejudice,
and sexual corruption. Initially, many opponents of slavery supported "colonization" -- the deportation of black
Americans to Africa, the Caribbean, or Central America. But by the late 1820s, it was obvious that colonization was a
wholly impractical solution to the problem of slavery. Each year the nation's slave population rose by 50,000, but in
1830, the American Colonization Society persuaded just 259 free blacks to migrate to Liberia, bringing the total
number of blacks colonized in Africa to just 1,400.

In 1829, a 25-year-old white Bostonian named William Lloyd Garrison denounced colonization as a cruel hoax
designed to promote the racial purity of the North while doing nothing to end slavery in the South. He demanded
"immediate emancipation" of slaves without compensation to their owners. Within six years, 200 antislavery societies
had sprouted up in the North, and had mounted a massive propaganda campaign against slavery.

The growth of militant abolitionism provoked a harsh public reaction. Mobs led by "gentlemen of property and
standing" attacked the homes and businesses of abolitionist merchants, destroyed abolitionist printing presses,
attacked black neighborhoods, and murdered the Reverend Elijah P. Lovejoy, the editor of an abolitionist newspaper.
In the face of vicious attacks, the antislavery movement divided over questions of strategy and tactics. Radicals, led
by Garrison, began to attack all forms of inequality and violence in American society, withdrew from churches that
condoned slavery, demanded equal rights for women, and called for voluntary dissolution of the Union. Other
abolitionists turned to politics as the most promising way to end slavery, helping to form the Liberty Party in 1840,
the Free Soil party in 1848, and the Republican party in 1854.

By the late 1850s, a growing number of northerners were convinced that slavery posed an intolerable threat to free
labor and civil liberties. Many believed that an aggressive Slave Power had seized control of the federal government,
incited revolution in Texas and war with Mexico, and was engaged in a systematic plan to extend slavery into the
western territories. At the same time, an increasing number of southerners believed that antislavery radicals
dominated northern politics and sought to bar slavery from the western territories and to undermine the institution in
the states where it already existed. John Brown's raid on the federal arsenal at Harpers Ferry in October 1859
produced shock waves throughout the South, producing fears of slave revolt and race war. When Abraham Lincoln
was elected in 1860, many white southerners were convinced that this represented the triumph of abolitionism in the
North and thought they had no choice but to secede from the Union. The new president, however, was passionately
committed to the preservation of the union, and peaceful secession proved to be impossible.

The Emancipation Proclamation

In July 1862, about two months before President Lincoln issued the preliminary Emancipation Proclamation,
Congress adopted a second Confiscation Act calling for the seizure of the property of slaveholders who were actively engaged in the rebellion. It seems unlikely that this act would have freed any slaves, since the federal government would have had to prove that individual slaveholders were traitors. (In fact, one of the largest slaveholders in South Carolina
was a Baltimore Unionist). Lincoln felt that Congress lacked the legal authority to emancipate slaves; he believed
that only the President acting as commander-in-chief had the authority to abolish slavery.

On September 22, 1862, less than a week after the Battle of Antietam, President Lincoln met with his cabinet. As one
cabinet member, Samuel P. Chase, recorded in his diary, the President told them that he had "thought a great deal
about the relation of this war to Slavery":

You all remember that, several weeks ago, I read to you an Order I had prepared on this subject, which,
since then, my mind has been much occupied with this subject, and I have thought all along that the
time for acting on it might very probably come. I think the time has come now. I wish it were a better
time. I wish that we were in a better condition. The action of the army against the rebels has not been
quite what I should have best liked. But they have been driven out of Maryland, and Pennsylvania is no
longer in danger of invasion. When the rebel army was at Frederick, I determined, as soon as it should
be driven out of Maryland, to issue a Proclamation of Emancipation such as I thought most likely to be
useful. I said nothing to any one; but I made the promise to myself, and (hesitating a little)--to my
Maker. The rebel army is now driven out, and I am going to fulfill that promise.

The preliminary Emancipation Proclamation that President Lincoln issued on September 22 stated that all slaves in
designated parts of the South on January 1, 1863, would be freed. The President hoped that slave emancipation
would undermine the Confederacy from within. Secretary of the Navy Gideon Welles reported that the President told
him that freeing the slaves was "a military necessity, absolutely essential to the preservation of the Union....The
slaves [are] undeniably an element of strength to those who [have] their service, and we must decide whether that
element should be with us or against us."

Fear of foreign intervention in the war also influenced Lincoln to consider emancipation. The Confederacy had
assumed, mistakenly, that demand for cotton from textile mills would lead Britain to break the Union naval blockade.
Nevertheless, there was a real danger of European involvement in the war. By redefining the war as a war against
slavery, Lincoln hoped to generate support from European liberals.

Even before Lincoln issued the Emancipation Proclamation, Postmaster General Montgomery Blair (1813-1883), a
former Democrat from Maryland, had warned the President that this decision might stimulate antiwar protests among
northern Democrats and cost the administration the fall 1862 elections. In fact, Peace Democrats did protest against
the proclamation and Lincoln's assumption of powers not specifically granted by the Constitution. Among the
"abuses" they denounced were his unilateral decision to call out the militia to suppress the "insurrection," impose a
blockade of southern ports, expand the army beyond the limits set by law, spend federal funds without prior
congressional authorization, and suspend the writ of habeas corpus (the right of persons under arrest to have their
case heard in court). The Lincoln administration imprisoned about 13,000 people without trial during the war, and
shut Democratic newspapers in New York, Philadelphia, and Chicago for varying amounts of time.

The Democrats failed to gain control of the House of Representatives in the Fall 1862 election, in part because the
preliminary Emancipation Proclamation gave a higher moral purpose to the northern cause.

Radical Reform and Antislavery

The initial thrust of reform--moral reform--was to rescue the nation from infidelity and intemperance. A second line
of reform, social or humanitarian reform, attempted to alleviate such sources of human misery as crime, cruelty,
disease, and ignorance. A third line of reform, radical reform, sought national regeneration by eliminating slavery
and racial and sexual discrimination.

Early Antislavery Efforts

As late as the 1750s, no church had discouraged its members from owning or trading in slaves. Slaves could be
found in each of the 13 American colonies, and before the American Revolution, only the colony of Georgia had
temporarily sought to prohibit slavery (because its founders did not want a workforce that would compete with the
convicts they planned to transport from England).

By the beginning of the 19th century, however, protests against slavery had become widespread. By 1804 nine
states north of Maryland and Delaware had either emancipated their slaves or adopted gradual emancipation plans.
Both the United States and Britain in 1808 outlawed the African slave trade.

The emancipation of slaves in the northern states and the prohibition against the African slave trade generated
optimism that slavery was dying. Congress in 1787 had barred slavery from the Old Northwest, the region north of
the Ohio River to the Mississippi River. The number of slaves freed by their masters had risen dramatically in the
upper South during the 1780s and 1790s, and more antislavery societies had been formed in the South than in the
North. At the present rate of progress, predicted one religious leader in 1791, within 50 years it will “be as shameful
for a man to hold a Negro slave, as to be guilty of common robbery or theft.”

By the early 1830s, however, the development of the Cotton Kingdom proved that slavery was not on the road to
extinction. Despite the end of the African slave trade, the slave population continued to grow, climbing from 1.5
million in 1820 to over 2 million a decade later.

A widespread belief that blacks and whites could not coexist and that racial separation was necessary encouraged
futile efforts at deportation and overseas colonization. In 1817 a group of prominent ministers and politicians formed
the American Colonization Society to resettle free blacks in West Africa, encourage planters voluntarily to emancipate
their slaves, and create a group of black missionaries who would spread Christianity in Africa. During the 1820s,
Congress helped fund the cost of transporting free blacks to Africa.

A few blacks supported African colonization in the belief that it provided the only alternative to continued degradation
and discrimination. Paul Cuffe (1759–1817), a Quaker sea captain who was the son of a former slave and an Indian
woman, led the first experiment in colonization. In 1815 he transported 38 free blacks to the British colony of Sierra
Leone, on the western coast of Africa, and devoted thousands of his own dollars to the cause of colonization. In 1822
the American Colonization Society established the colony of Liberia, in west Africa, for resettlement of free American
blacks.

It soon became apparent that colonization was a wholly impractical solution to the nation’s slavery problem. Each
year the nation’s slave population rose by roughly 50,000, but in 1830 the American Colonization Society succeeded
in persuading only 259 free blacks to migrate to Liberia, bringing the total number of blacks colonized in Africa to a
mere 1,400.

The Rise of Abolitionist Sentiment in the North

Initially, free blacks led the movement condemning colonization and northern discrimination against African
Americans. As early as 1817, more than 3,000 members of Philadelphia’s black community staged a protest against
colonization, at which they denounced the policy as “little more merciful than death.” In 1829 David Walker (1785–
1830), the free black owner of a second-hand clothing store in Boston, issued the militant Appeal to the Colored
Citizens of the World. The appeal threatened insurrection and violence if calls for the abolition of slavery and
improved conditions for free blacks were ignored. The next year, some 40 black delegates from 8 states held the first
of a series of annual conventions denouncing slavery and calling for an end to discriminatory laws in the northern
states.

The idea of abolition received impetus from William Lloyd Garrison (1805–1879). In 1829 the 25-year-old white
Bostonian added his voice to the outcry against colonization, denouncing it as a cruel hoax designed to promote the
racial purity of the northern population while doing nothing to end slavery in the South. Instead, he called for
“immediate emancipation.” By immediate emancipation, he meant the immediate and unconditional release of slaves
from bondage without compensation to slaveowners.

In 1831, Garrison founded The Liberator, a militant abolitionist newspaper that was the country’s first publication to
demand an immediate end to slavery. On the front page of the first issue, he defiantly declared: “I will not
equivocate--I will not excuse--I will not retreat a single inch--AND I WILL BE HEARD.” Incensed by Garrison’s
proclamation, the state of Georgia offered a $5,000 reward to anyone who brought him to the state for trial.

Within 4 years, 200 antislavery societies had appeared in the North. In a massive propaganda campaign to proclaim
the sinfulness of slavery, they distributed a million pieces of abolitionist literature and sent 20,000 tracts directly to
the South.

Abolitionist Arguments and Public Reaction

Abolitionists attacked slavery on several grounds. Slavery was illegal because it violated the principles of natural
rights to life and liberty embodied in the Declaration of Independence. Justice, said Garrison, required that the nation
“secure to the colored population...all the rights and privileges that belong to them as men and as Americans.”
Slavery was sinful because slaveholders, in the words of abolitionist Theodore Weld, had usurped “the prerogative of
God.” Masters reduced a “God-like being” to a manipulable “THING.” Slavery also encouraged sexual immorality and
undermined the institutions of marriage and the family. Not only did slave masters sexually abuse and exploit slave
women, abolitionists charged, but in some older southern states, such as Virginia and Maryland, they bred slaves for
sale to the more recently settled parts of the Deep South.

Slavery was economically retrogressive, abolitionists argued, because slaves, motivated only by fear, did not exert
themselves willingly. By depriving their labor force of any incentive for performing careful and diligent work, by
barring slaves from acquiring and developing productive skills, planters hindered improvements in crop and soil
management. Abolitionists also charged that slavery impeded the development of towns, canals, railroads, and
schools. Antislavery agitation provoked a harsh public reaction in both the North and the South. The U.S. postmaster
general refused to deliver antislavery tracts to the South. In each session of Congress between 1836 and 1844 the
House of Representatives adopted gag rules allowing that body automatically to table resolutions or petitions
concerning the abolition of slavery.

Mobs led by “gentlemen of property and standing” attacked the homes and businesses of abolitionist merchants,
destroyed abolitionist printing presses, disrupted antislavery meetings, and terrorized black neighborhoods. Crowds
pelted abolitionist reformers with eggs and even stones. During antiabolitionist rioting in Philadelphia in October
1834, a white mob destroyed 45 homes in the city’s black community. A year later, a Boston mob dragged Garrison
through the streets and almost lynched him before authorities removed him to a city jail for his own safety. In 1837,
the abolitionist movement acquired its first martyr when an antiabolitionist mob in Alton, Illinois, murdered Reverend
Elijah Lovejoy, an editor of a militant abolitionist newspaper. Three times mobs destroyed Lovejoy’s printing presses
and attacked his house. When a fourth press arrived, Lovejoy armed himself and guarded the new press at the
warehouse. The antiabolitionist mob set fire to the warehouse, shot Lovejoy as he fled the building, and dragged his
mutilated body through the town.

Division Within the Antislavery Movement

Questions over strategy and tactics divided the antislavery movement. At the 1840 annual meeting of the American
Anti-Slavery Society in New York, abolitionists split over such questions as women’s right to participate in the
administration of the organization and the advisability of nominating abolitionists as independent political candidates.
Garrison won control of the organization, and his opponents promptly walked out. From this point on, no single
organization could speak for abolitionism.

One group of abolitionists looked to politics as the answer to ending slavery and founded political parties for that
purpose. The Liberty Party, founded in 1839 under the leadership of Arthur and Lewis Tappan, wealthy New York City
businessmen, and James G. Birney, a former slaveholder, called on Congress to abolish slavery in the District of
Columbia, end the interstate slave trade, and cease admitting new slave states to the Union. The party also sought
the repeal of local and state “black laws” in the North, which discriminated against free blacks, much as segregation
laws would in the post-Reconstruction South. The Liberty Party nominated Birney for president in 1840 and again in
1844. Although it gathered fewer than 7,100 votes in its first campaign, it polled some 62,000 votes 4 years later and
captured enough votes in Michigan and New York to deny Henry Clay the presidency.

In 1848 antislavery Democrats and Whigs merged with the Liberty Party to form the Free Soil Party. Unlike the
Liberty Party, which was dedicated to the abolition of slavery and equal rights for African Americans, the Free Soil
Party narrowed its demands to the abolition of slavery in the District of Columbia and the exclusion of slavery from
the federal territories. The Free Soilers also wanted a homestead law to provide free land for western settlers, high
tariffs to protect American industry, and federally sponsored internal improvements. Campaigning under the slogan
“free soil, free speech, free labor, and free men,” the new party polled 300,000 votes (or 10 percent) in the
presidential election of 1848 and helped elect Whig candidate Zachary Taylor.

Other abolitionists, led by Garrison, took a more radical direction, advocating civil disobedience and linking
abolitionism to other reforms such as women’s rights, world government, and international peace. Garrison and his
supporters established the New England Non-Resistance Society in 1838. Members refused to vote, to hold public
office, or to bring suits in court. In 1854 Garrison attracted notoriety by publicly burning a copy of the Constitution,
which he called “a covenant with death and an agreement with Hell” because it acknowledged the legality of slavery.

African Americans played a vital role in the abolitionist movement, staging protests against segregated churches,
schools, and public transportation. In New York and Pennsylvania, free blacks launched petition drives for equal
voting rights. Northern blacks also had a pivotal role in the “underground railroad,” which provided escape routes for
southern slaves through the northern states and into Canada. African-American churches offered sanctuary to
runaways, and black “vigilance” groups in cities such as New York and Detroit battled slave catchers who sought to
recapture fugitive slaves.

Fugitive slaves, such as William Wells Brown, Henry Bibb, and Harriet Tubman, advanced abolitionism by publicizing
the horrors of slavery. Their firsthand tales of whippings and separation from spouses and children combated the
notion that slaves were contented under slavery and undermined belief in racial inferiority. Tubman risked her life by
making 19 trips into slave territory to free as many as 300 slaves. Slaveholders posted a reward of $40,000 for the
capture of the “Black Moses.”

Frederick Douglass was the most famous fugitive slave and black abolitionist. The son of a Maryland slave woman and
an unknown white father, Douglass was separated from his mother and sent to work on a plantation when he was 6
years old. At the age of 20, in 1838, he escaped to the North using the papers of a free black sailor. In the North,
Douglass became the first runaway slave to speak out against slavery. When many Northerners refused to believe
that this eloquent orator could possibly have been a slave, he responded by writing an autobiography that identified
his previous owners by name. Although he initially allied himself with William Lloyd Garrison, Douglass later started
his own newspaper, The North Star, and supported political action against slavery.

By the 1850s, many blacks had become pessimistic about defeating slavery. Some African Americans looked again to
colonization as a solution. In the 15 months following passage of the federal Fugitive Slave Law in 1850, some
13,000 free blacks fled the North for Canada. In 1854, Martin Delany (1812–1885), a Pittsburgh doctor who had
studied medicine at Harvard, organized the National Emigration Convention to investigate possible sites for black
colonization in Haiti, Central America, and West Africa.

Other blacks argued in favor of violence. Black abolitionists in Ohio adopted resolutions encouraging slaves to escape
and called on their fellow citizens to violate any law that “conflicts with reason, liberty and justice, North or South.” A
meeting of fugitive slaves in Cazenovia, New York, declared that “the State motto of Virginia, ‘Death to Tyrants,’ is as
well the black man’s as the white man’s motto.” By the late 1850s, a growing number of free blacks had concluded
that it was just as legitimate to use violence to secure the freedom of the slaves as it had been to establish the
independence of the American colonies.

Over the long run, the fragmentation of the antislavery movement worked to the advantage of the cause. Henceforth,
Northerners could support whichever form of antislavery best reflected their views. Moderates could vote for political
candidates with abolitionist sentiments without being accused of radical Garrisonian views or of advocating violence
for redress of grievances.

The Fugitive Slave Law

The most explosive element in the Compromise of 1850 was the Fugitive Slave Law, which required the return of
runaway slaves. Any black--even free blacks--could be sent south solely on the affidavit of anyone claiming to be his
or her owner. The law stripped runaway slaves of such basic legal rights as the right to a jury trial and the right to
testify in one's own defense.

Under the Fugitive Slave Law, an accused runaway was to stand trial in front of a special commissioner, not a judge
or a jury, and the commissioner was to be paid $10 if a fugitive was returned to slavery but only $5 if the
fugitive was freed. Many Northerners regarded this provision as a bribe to ensure that any black accused of being a
runaway would be found guilty. Finally, the law required all U.S. citizens and U.S. marshals to assist in the capture of
escapees. Anyone who refused to aid in the capture of a fugitive, interfered with the arrest of a slave, or tried to free
a slave already in custody was subject to a heavy fine and imprisonment.

The Fugitive Slave Law produced widespread outrage in the North and convinced thousands of Northerners that
slavery should be barred from the western territories.

Attempts to enforce the Fugitive Slave Law provoked wholesale opposition. Eight northern states enacted "personal
liberty" laws that prohibited state officials from assisting in the return of runaways and extended the right of jury trial
to fugitives. Southerners regarded these attempts to obstruct the return of runaways as a violation of the Constitution
and federal law.

The free black communities of the North responded defiantly to the 1850 law. They provided fugitive slaves with
sanctuary and established vigilance committees to protect blacks from hired kidnappers who were searching the
North for runaways. Some 15,000 free blacks emigrated to Canada, Haiti, the British Caribbean, and Africa after the
adoption of the 1850 federal law.

The South's demand for an effective fugitive slave law was a major source of sectional tension. In Christiana,
Pennsylvania, in 1851, a gun battle broke out between abolitionists and slave catchers, and in Wisconsin,
abolitionists freed a fugitive named Joshua Glover from a local jail. In Boston, federal marshals and 22 companies of
state troops were needed to prevent a crowd from storming a court house to free a fugitive named Anthony Burns.

A New Birth of Freedom: The Day of Jubilee

The North's victory in the Civil War produced a social revolution in the South. Four million slaves were freed; a quarter of a million southern whites, one-fifth of the white male population, had died; and $2.5 billion worth of property had been lost.

Slave emancipation did not come in a single moment. In coastal South Carolina and parts of Louisiana and Florida,
some slaves gained their freedom as early as the fall of 1861, when Union generals like John C. Fremont, without
presidential or Congressional authorization, proclaimed slaves in their conquered districts to be free. During the war,
slaves, by the tens of thousands, abandoned their plantations and flocked to Union lines. Black soldiers in the Union
Army and their families automatically gained freedom.

Many slaves in Texas did not formally hear about freedom until June 19, 1865, which is why "Juneteenth" continues
to be celebrated as emancipation day throughout the Southwest. Many slaves in the border states that remained in
the Union--Delaware, Kentucky, Maryland, and Missouri--were not freed until December 1865, eight months after the
end of the Civil War, when the 13th Amendment, abolishing slavery, was ratified.

For former slaves, emancipation was a moment of exhilaration, fear, and uncertainty. While some greeted the
announcement of freedom with jubilation, others reacted with stunned silence. Responses to the news of
emancipation ranged from exhilaration and celebration to incredulity and fear.

In Choctaw County, Mississippi, former slaves whipped a planter named Nat Best to retaliate for his cruelties. In
Richmond, Virginia, some 1,500 ex-slaves gathered in the Free African church to sing hymns. A parade in Charleston
attracted 10,000 spectators and featured a black-draped coffin bearing the words "Slavery is Dead." A northern
journalist met an ex-slave in North Carolina who had walked 600 miles searching for his wife and children who had
been sold four years before. But many freed men and women felt a deep uncertainty about their status and rights.

Ex-slaves expressed their newly-won freedom in diverse ways. Many couples, forbidden to marry during slavery,
took the opportunity to formalize their unions. Others, who had lived apart from their families on separate
plantations, were finally free to reside with their spouses and children. As an expression of their freedom, many
freedmen dropped their slave names, adopted new surnames, and insisted on being addressed as "mister" or
"missus." Many black women withdrew from field labor to care for their families.

Many ex-slaves left farms or plantations for towns or cities "where freedom was free-er." Across the South, former
slaves left white-dominated churches and formed independent black congregations; founded schools; and set up
mutual aid societies. They also held freedmen's conventions to air grievances, discuss pressing issues, and press for
equal civil and political rights.

Shocked at seeing former slaves transformed into free women and men, many southern whites complained of
"betrayal" and "ingratitude" when freedmen left their plantations. Revealing slaveowners' capacity for self-deception,
one former master complained that "those we loved best, and who loved us best--as we thought--were the first to
leave us."

In many parts of the South, the end of the war was followed by outbursts of white rage. White mobs whipped,
clubbed, and murdered ex-slaves. In contrast, the vast majority of former slaves refrained from vengeance against
former masters. Instead they struggled to achieve social and economic independence by forming separate lodges, newspapers,
and political organizations.

Reconstruction was a time of testing, when freedmen probed the boundaries and possibilities of freedom. Every
aspect of southern life was subject to redefinition, from forms of racial etiquette to the systems of labor that would
replace slavery.

The founding fathers set a high standard of ideals for the new nation to live up to back
in 1776. But from the very beginning, debate about the best way to do that has been an
inherent part of the American experiment. Since its founding, the United States has had
both high and low moments on its road to ensuring freedom and equality for its citizens.
Take a look back at eight moments in history when the nation made strides toward
ensuring life, liberty and the pursuit of happiness—for all.

The Declaration of Independence

The signing of the Declaration of Independence.

More than a year after fighting broke out between colonial militia and British forces in
April 1775, the Continental Congress in Philadelphia finally decided to declare the
independence of the North American colonies. The main goal of the Declaration of
Independence, adopted on July 4, 1776, was to present the colonists’ grievances against
Great Britain, but it would be Thomas Jefferson’s introductory words (“We hold these
truths to be self-evident; that all men are created equal; that they are endowed by their
Creator with certain inalienable rights…”) that would echo most strongly through
generations to come.

The Bill of Rights



In the earliest years of the new nation, many people opposed the Constitution because
they thought it gave the federal government too much power over its people. As soon as
the new U.S. Congress met, it began debating a number of constitutional amendments,
the first 10 of which were ratified in December 1791 as the Bill of Rights. By
guaranteeing certain fundamental rights—including freedom of speech and religion, the
right to bear arms and the right to a fair trial—against infringement by the federal
government, the Bill of Rights greatly expanded the civil liberties of Americans, with
implications that are still being debated today.

The Abolition of Slavery

An illustration depicting African American life before and after the Emancipation Proclamation.

By 1862, President Abraham Lincoln had become convinced that freeing the South’s slaves was critical to the Union effort to win the Civil War. Though the Emancipation
Proclamation, which took effect the following year, applied only to the slaves in
Confederate states, Lincoln made it clear in his historic Gettysburg Address that the
Union now fought to provide a “new birth of freedom” rather than simply bring the South
back into the fold. Passage of the 13th Amendment to the Constitution in 1865 abolished
the institution of slavery, and granted liberty to more than 4 million black men, women
and children formerly held in bondage.

‘Yearning to Breathe Free’— The Era of Immigration

An immigrant family on the dock at Ellis Island, having just passed the examination for entry into the United States.

“Give me your tired, your poor/Your huddled masses yearning to breathe free,” the poet
Emma Lazarus imagined the Statue of Liberty saying to the world in her famous sonnet
“The New Colossus.” From 1880 to 1920, more than 20 million immigrants came to the
United States seeking freedom and new opportunity. Whether they were fleeing religious
persecution (Eastern European Jews), hunger and poverty (Italians), or war or revolution
at home (Armenians and Mexicans), the United States welcomed these new arrivals—with
the notable exception of people from Asian countries, whose entrance was strictly limited
by laws such as the Chinese Exclusion Act of 1882. This relatively open-door policy
ended with the onset of World War I, and in the 1920s a series of new laws would be
introduced to limit immigration.

The 19th Amendment


Women casting their first votes for president in New York City, 1920.

Some 72 years after the national women’s rights movement launched at Seneca Falls,
ratification of the 19th Amendment in 1920 finally gave women the right to vote. Despite
setbacks and internal divisions in the decades after the Civil War, the suffrage
movement gained momentum in the early 20th century, as protesters were arrested,
imprisoned and in some cases went on hunger strikes for the cause. After Tennessee
became the last necessary state to ratify the 19th Amendment in August 1920, women
across the country headed to the polls to exercise their long-awaited right to cast their
ballots in the presidential election that fall.

D-Day

Colorized photo “Into the Jaws of Death,” taken by Robert F. Sargent, showing troops of the United States Army First Infantry Division disembarking from a landing craft onto Omaha Beach during the Normandy landings on D-Day, June 6, 1944.

“People of western Europe…the hour of your liberation is approaching,” General Dwight D. Eisenhower, supreme commander of the Allied Expeditionary Force, announced in a
speech broadcast via radio on June 6, 1944. By the end of that day, some 156,000
American, British and Canadian forces had landed simultaneously on five beachheads
in northern France, beginning the Allied invasion of Western Europe during World War II.
As Eisenhower’s speech had predicted, the triumphant landing marked the beginning of
the end for Adolf Hitler’s Nazi forces, which would surrender unconditionally less than a
year later.

The Civil Rights Act of 1964


In 1963, as civil rights activists protesting segregation and voting restrictions across the
South met with violent opposition, and hundreds of thousands of people marched on
Washington to demand “Jobs and Freedom,” President John F. Kennedy introduced the
first major civil rights legislation since Reconstruction. After JFK’s assassination that
November, his successor Lyndon B. Johnson took up the cause, doggedly pushing the
bill through stiff opposition from Southern Democrats in Congress. On July 2, 1964, Johnson signed
into law the Civil Rights Act, which ended the segregation of public and many private
facilities, and outlawed discrimination based on race, color, religion, sex or national
origin.

Freedom to Marry

LGBTQ activists react to the decision recognizing same-sex marriage as a civil right.

On June 26, 2015, the Supreme Court issued a landmark ruling declaring that the
Constitution guarantees to same-sex couples the freedom to marry. The case that led to
this milestone achievement for the gay rights movement, Obergefell v. Hodges, began
when same-sex couples sued in Ohio, Michigan, Kentucky and Tennessee, declaring
that their states’ bans on gay marriage were unconstitutional. In a decision that echoed
the Court’s 1967 verdict in Loving v. Virginia, which struck down state laws banning
interracial marriage, Justice Anthony Kennedy declared that the freedom to marry was
one of the most fundamental liberties guaranteed to individuals under the 14th
Amendment, and should apply to same-sex couples just as it does to heterosexual
couples. “They ask for equal dignity in the eyes of the law,” Kennedy wrote. “The
Constitution grants them that right.”  

Native Americans - in Brief

Who are the Native Americans? Thanks to Disney, you probably remember Hiawatha, who was portrayed as a cute guy with a feather in his hair. You have probably not forgotten the Disney version of the beautiful Pocahontas either. Hiawatha and Pocahontas are both Native American characters.

Tribal People

When the Europeans started to settle in North America in the 1500s, the first natives they met were a tribal
people. The most famous tribes were the Sioux, the
Navajo, the Apaches and the Cherokees. It is difficult to
give exact numbers, but it is estimated that the native
population was about 1.5 million. They represented
different tribes, each with its own language and culture.
Hiawatha probably was a chief from a tribe called the
Iroquois and Pocahontas was the daughter of a
Powhatan chief.

Indians
When Christopher Columbus in 1492 set sail from a
Spanish harbor, he was heading for India. Two months
later he spotted land and thought he was in India, when in fact he had landed on an American island.

Consequently, he thought that the first people he met were Indians. Due to this mistake, the Native Americans have been called Indians for 500 years.
Columbus had to report back to the Spanish queen. This is what he wrote about the natives: “They
ought to make good and skilled servants, for they repeat very quickly whatever we say to them. I
think they can very easily be made Christians, for they seem to have no religion.”

Clash of Interests
To start with, relations between the American natives and the first settlers of North America were friendly. This gradually changed. Throughout the 1800s new settlers arrived every day, and they needed more space. The tribes had never thought of the land as belonging to them. To them, it was the Great Spirit that had created the land for them to use. On the prairie they hunted the buffalo with their bows and arrows, and every part of the animal was used. The white man, however, brought modern weapons and wanted his own parcels of farmland. Not only did the white man bring modern ways of living, he also introduced the natives to dangerous diseases against which they had no immunity. One of the diseases that claimed many victims was smallpox.

From Bad to Worse


After the Indian Removal Act was passed in 1830, five tribes (100,000 people in all) were forced to move from their
hunting grounds and homelands. One of these removals is known as the “Trail of Tears”. When gold was discovered in
an area in Georgia inhabited by Cherokees, the tribe was forced to move in 1839. It was freezing winter; many of the Native Americans had to walk barefoot, and 4,000 (among them many children) did not survive the long trail to their destination in Oklahoma, where the authorities had set aside an area that they called Indian Territory. This was the first reservation that was established. There were more to come.

In 1862 the Homestead Act was passed, giving settlers 160 acres of farmland for free. In this way the authorities wanted to
encourage Europeans to establish farms on the former tribal hunting grounds, the prairie. Many of the tribes had been
forced to move to Oklahoma in the years before this. This was a definite blow for the natives and their favorite animal, the
buffalo.

Life on the Reservations


Starting with Oklahoma, reservations all over the North American continent were created to control the Native Americans.
The land that they were given by the authorities was infertile, and they had to give up their original way of living. This led them into poverty, and a sad side effect was that many of the once-proud Indians became alcoholics and drug addicts. Many of the children were taken away from their parents and taught how to be “good” Americans, which basically meant to behave like the whites. In this way the Native Americans lost important parts of their culture, and the number of North American Indians decreased from 1.5 million in the 1500s to about 350,000 by 1920.

The Modern Native American


A new act in 1924 finally gave the Native Americans full American citizenship and therefore more rights. Money was given
in compensation for the land that had been given to the white man. Some of the investments the Native Americans made
were in the gambling business. Many reservations
established casinos which became an important source of income.

The Current Situation


Today there are about 2.8 million Native Americans, or American Indians, as some prefer to call themselves. They are represented by 562 different tribes living in cities or on the 300 reservations in the United States. Statistics show that the Native Americans are among the poorest citizens in the USA. However, many Native Americans take pride in their culture and history. Maybe we should all learn a lesson from the Native Americans and how they respected and worshipped Nature.
Page 44

Amending the Constitution

It is a measure of the success of the Constitution's drafters that after the adoption in 1791 of the ten amendments
that constitute the Bill of Rights, the original document has been changed only 17 times.

Only six of those amendments have dealt with the structure of government. With the exception of Prohibition and its
revocation, the main thrust of the other amendments has been to protect or expand the rights already guaranteed in
the Constitution and the Bill of Rights.

Over the years, there have been many proposals to alter the Constitution. These range from an 1808 proposal by a Connecticut Senator that the nation choose its president through an annual random drawing from a list of retiring senators to a 1923 proposal for an amendment to guarantee equal rights for women.

If the Constitution has rarely been amended, it is in no small part because its authors made it difficult to tamper with. Amendments must follow one of two routes. Under the one followed by all amendments to date, two-thirds majorities of each house of Congress vote their approval and three-quarters of the state legislatures add their ratification. Under the second route, two-thirds of the states may vote to call a constitutional convention, whose proposed amendments must be ratified by three-quarters of the state legislatures.
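
As a rough illustration of the arithmetic behind these two routes, the sketch below works out the thresholds using today's chamber sizes and 50 states. It assumes, for simplicity, that every member is present and voting (the real congressional requirement is two-thirds of members present).

```python
import math

# Illustrative thresholds for the two amendment routes described above.
# Assumes full attendance; the actual congressional rule is two-thirds of
# members present and voting.
HOUSE_SEATS, SENATE_SEATS, STATES = 435, 100, 50

house_needed = math.ceil(2 / 3 * HOUSE_SEATS)    # 290 Representatives
senate_needed = math.ceil(2 / 3 * SENATE_SEATS)  # 67 Senators
states_needed = math.ceil(3 / 4 * STATES)        # 38 state legislatures to ratify
convention_call = math.ceil(2 / 3 * STATES)      # 34 states to call a convention

print(house_needed, senate_needed, states_needed, convention_call)  # 290 67 38 34
```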

The first ten amendments were added in 1791 and later amendments introduced such far-reaching changes as
ending slavery, creating national guarantees of due process and individual rights, granting women the vote, and
providing for direct popular election of senators.

In 1793, the Supreme Court angered states by accepting jurisdiction in a case where an individual sued the state of
Georgia. To ensure that did not happen again, Congress and the states added the 11th Amendment in 1798.

The 12th Amendment, ratified in 1804, had electors vote separately for president and vice president. Until then,
the candidate with the most Electoral College votes became president, and the runner up, vice president.

Slavery generated three amendments. The 13th Amendment, ratified in 1865, abolished slavery. The 14th
Amendment was adopted in 1868 to protect the civil rights of former slaves. It granted citizenship to all people
born in the United States. Two years later, the 15th Amendment declared that the right to vote shall not be
abridged on account of race or previous condition of servitude.

The 16th Amendment (1913) authorized an income tax, which the Supreme Court had declared unconstitutional in
1895.

The 17th Amendment required direct election of senators.

In 1919, the states approved the 18th Amendment, prohibiting the manufacture and sale of alcoholic beverages. In
1933, Congress proposed an amendment to repeal Prohibition. The 21st Amendment was ratified in just 286 days.

The 19th Amendment extended the vote to women.

The 20th Amendment reduced the time between the election of national officials and their assumption of office.

The 22nd Amendment, adopted in 1951, limited presidents to two terms.

The 23rd Amendment, enacted in 1961, allowed residents of the District of Columbia to vote in presidential
elections.

The 24th Amendment, ratified in 1964, prohibited a poll tax in federal elections.

The 25th Amendment (1967) provided a system for selecting a new vice president after the death or resignation of
a president. It also established a system to deal with the possibility that a president might become disabled.

The 26th Amendment, adopted in 1971, extended the vote to 18 year-olds.

The 27th Amendment, ratified in 1992, prevents Congress from giving itself an immediate pay increase. It says that a change in pay can only go into effect after the next congressional election.
Page 45
American Constitution
HISTORICAL BACKGROUND

To understand any country's political system, it is helpful to know something of the history of the nation
and the background to the creation of the (latest) constitution. But this is a fundamental necessity in the
case of the American political system. This is because the Constitution of the United States is so different
from those of other nations and because that Constitution is, in all material respects, the same document
as it was over two centuries ago.

There were four main factors in the minds of the 'founding fathers' who drafted the US Constitution:

1. The United States had just fought and won a bloody War of Independence from Britain
and it was determined to create a political system that was totally different from the
British system in which considerable authority still resided in a hereditary King (George III
at the time) or Queen and in which Parliament was increasingly assertive in the exercise
of its growing powers. Therefore the new constitution deliberately spread power between
the three arms of government - executive, legislature and judiciary - and ensured that
each arm was able to limit the exercise of power by the other arms.

2. The United States was already a large country with problems of communications and a
population of varied background and education. Therefore, for all the intentions to be a
new democracy, it was seen as important to limit the influence of swings in public opinion.
So the election of the president was placed in the hands of an Electoral College, rather
than the subject of direct election, and the terms of office of the president and the two
chambers of the legislature were all set at different lengths.

3. The United States was the creation of 13 individual states, each of which valued its
traditions and powers, and so the overarching federal government was deliberately limited
in its powers compared to the position of the central government in other nations.
Arguably the later Civil War was about states' rights more than it was about slavery and
there is still a real tension today between the states and federal government.

4. The original 13 states of the USA were of very different size in terms of population and
from the beginning there was a determination by the smaller states that political power
should not be excessively in the hands of the larger states. Therefore the Constitution is
built on a 'Great Compromise' between the Virginia plan (representation by population)
and the New Jersey plan (equal representation for all states) which resulted in the House
of Representatives being constructed on the basis of population and the Senate being
composed of an equal number of representatives regardless of population. This is why
today seven states have only one member in the House of Representatives but two members
in the Senate.

Whatever the 'founding fathers' intended, the sheer longevity of the Constitution and the profound changes in America since its drafting mean that today the balance of power is not necessarily what the drafters of the Constitution had in mind. So originally the legislature was
seen as the most powerful arm of government (it is described first in the Constitution) but,
over time, both the Presidency (starting with the time of Abraham Lincoln and the Civil War)
and the Supreme Court (especially on social issues like desegregation, marriage and abortion)
have assumed more power.
Page 46

THE CONSTITUTION

Unlike Britain but like most nation states, the American political system is clearly defined by
basic documents. The Declaration of Independence of 1776 and the Constitution of 1789 form
the foundations of the United States federal government. The Declaration of Independence
establishes the United States as an independent political entity, while the Constitution creates
the basic structure of the federal government. Both documents are on display in the National
Archives and Records Administration Building in Washington, D.C. which I have visited several
times. Further information on the thinking expressed in the Constitution can be found in the
Federalist Papers which are a series of 85 articles and essays published in 1787-1788
promoting the ratification of the Constitution.

The United States Constitution is both the longest-lasting in the world, being over two centuries old, and one of the shortest in the world, having just seven articles and 27 amendments (the constitutions of Jordan, Libya and Iceland are the shortest in the world, running to a mere 2,000-4,000 words).

As well as its age and brevity, the US Constitution is notable for being a remarkably stable document. The first 10 amendments were all proposed in 1789 - the same year the original constitution came into effect - and ratified in 1791; they are collectively known as the Bill of Rights. If one accepts that these first 10 amendments were in effect part of the original constitutional settlement, there have only been 17 amendments in almost 230 years. In fact, famously the 27th Amendment took over 200 years to achieve ratification, having been originally proposed at the same time as the 10 that make up the Bill of Rights but having only reached ratification in 1992. The last new and substantive amendment - reduction of the voting age to 18 - was in 1971, almost half a century ago.

One of the major reasons for this relative immutability is that - quite deliberately on the part of
its drafters - the Constitution is a very difficult instrument to change. First, a proposed
amendment has to secure a two-thirds vote of members present in both houses of Congress.
Then three-quarters of the state legislatures have to ratify the proposed change (this stage
may or may not be governed by a specific time limit).

As an indication of how challenging this process is, consider the case of the Equal Rights Amendment (ERA). This was first drafted in 1923, shortly after women were given the vote in the USA. The proposed amendment was introduced in Congress unsuccessfully in every legislative year from 1923 until it was finally passed in 1972. It was then sent to each state for ratification but, by 1982, it was still three states short of the minimum of 38 needed to add it to the constitution. Various attempts since 1982 to revive the amendment have all failed.

At the heart of the US Constitution is the principle known as 'separation of powers', a term coined by the French Enlightenment political thinker Montesquieu. This means that power is spread between three institutions of the state - the executive (President & Cabinet), the legislature (House of Representatives & Senate) and the judiciary (Supreme Court & federal circuits) - so that no one institution has too much power and no individual can be a member of more than one institution.
Page 47

This principle is also known as 'checks and balances', since each of the three branches of the
state has some authority to act on its own, some authority to regulate the other two branches,
and has some of its own authority, in turn, regulated by the other branches.
Not only is power spread between the different branches; the members of those branches are
deliberately granted by the Constitution different terms of office which is a further brake on
rapid political change. So the President has a term of four years, while members of the Senate
serve for six years and members of the House of Representatives serve for two years.
Members of the Supreme Court effectively serve for life.

The great benefit of this system is that power is spread and counter-balanced and the
'founding fathers' - the 55 delegates who drafted the Constitution - clearly wished to create a
political system which was in sharp contrast to, and much more democratic than, the
monarchical system then in force in Britain. The great weakness of the system is that it makes government slow, complicated and legalistic, which is a particular disadvantage in a world - unlike that of 1776 - in which political and economic developments are fast-moving and the USA is a - indeed the - superpower.

Since the Constitution is so short, so old and so difficult to change, for it to be meaningful to
contemporary society it requires interpretation by the courts and ultimately it is the Supreme
Court which determines what the Constitution means. There are very different approaches to
the interpretation of the Constitution with the two main strands of thought being known
as originalism and the Living Constitution.

Originalism is a principle of interpretation that tries to discover the original meaning or intent
of the constitution. It is based on the principle that the judiciary is not supposed to create,
amend or repeal laws (which is the realm of the legislative branch) but only to uphold them.
This approach tends to be supported by conservatives.

Living Constitution is a concept which claims that the Constitution has a dynamic meaning and
that contemporary society should be taken into account when interpreting key constitutional
phrases. Instead of seeking to divine the views of the drafters of the document, it claims that
they deliberately wrote the Constitution in broad terms so that it would remain flexible. This
approach tends to be supported by liberals.
Page 48

THE HOUSE OF REPRESENTATIVES

What is the House of Representatives?

The House of Representatives is the lower chamber in the bicameral legislature known
collectively as Congress. The founders of the United States intended the House to be the
politically dominant entity in the federal system and, in the late 18th and early 19th centuries,
the House served as the primary forum for political debate. However, subsequently the Senate
has been the dominant body.

Who is eligible to become a member of the House?

To be a member of the House, one has to:


be at least 25 years old
have been a US citizen for at least seven years
live in the state which one represents (but not the actual district)

How is a member of the House chosen?

The House consists of 435 members (set in 1911), each of whom represents a congressional
district and serves for a two-year term. House seats are apportioned among the states by
population according to each decennial (every 10 years) census, but every state must have at
least one member and in fact seven states have only one Representative each (Alaska,
Delaware, Montana, North Dakota, South Dakota, Vermont and Wyoming). Typically a House
constituency would represent around 700,000 people.
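
The passage does not spell out the apportionment formula itself; the method actually used since 1941 is the Huntington-Hill "method of equal proportions". The following is a minimal Python sketch of that method, using made-up populations for three hypothetical states purely for illustration.

```python
import math
from typing import Dict

def apportion_house_seats(populations: Dict[str, int], total_seats: int = 435) -> Dict[str, int]:
    """Huntington-Hill ('method of equal proportions') apportionment sketch.

    Every state starts with its constitutionally guaranteed single seat; each
    remaining seat goes to the state with the highest priority value
    population / sqrt(n * (n + 1)), where n is the state's current seat count.
    """
    seats = {state: 1 for state in populations}
    for _ in range(total_seats - len(populations)):
        winner = max(populations,
                     key=lambda s: populations[s] / math.sqrt(seats[s] * (seats[s] + 1)))
        seats[winner] += 1
    return seats

# Made-up populations for three hypothetical states, apportioning 12 seats:
print(apportion_house_seats({"A": 8_000_000, "B": 3_000_000, "C": 600_000}, total_seats=12))
```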

Once House seats are reapportioned to the states, it is state legislatures that must redraw the
physical boundaries of Congressional districts. Although the states are bound by limits
established by Congress and the Supreme Court, there is scope for gerrymandering to ensure
electoral advantage for the dominant political party in the state. Such reapportionment of
members of the House takes effect three years after the decennial census so, as the next
census will take place in 2020, reapportionment will take effect for the 118th Congress (2023-
2025).

Members of the House are elected by first-past-the-post voting in every state except Louisiana
and Washington, which have run-offs if no candidate secures more than 50% of the vote.
Elections are always held on the first Tuesday after the first Monday in November in even
numbered years. Turnout in congressional elections - especially for the House - is generally much lower than in other liberal democracies. In a year when there is a Presidential election,
turnout is typically around 50%; in years when there is no Presidential election (known as mid-
terms), it usually falls to around one third of the electorate.

In the event that a member of the House of Representatives dies or resigns before the end of
the two-year term, a special election is held to fill the vacancy.

The House has five non-voting delegates from the District of Columbia (1971), Guam (1972), the Virgin Islands (1976), American Samoa (1981) and the Northern Mariana Islands (2008)
and one resident commissioner for Puerto Rico (1976), bringing the total formal membership to
441. Non-voting delegates are not allowed floor votes, but can vote in any committees to
which they are assigned.
Page 49

What are the powers of the House?


The House of Representatives is one of the two chambers that can initiate and pass
legislation, although to become law any legislation has to be approved by the Senate as
well.

Each chamber of Congress has particular exclusive powers. The House must introduce any
bills for the purpose of raising revenue.

If the Electoral College is tied, the choice of President is made by the House of
Representatives.

The House has a key role in any impeachment proceedings against the President or Vice-
President. It lays the charges which are then passed to the Senate for a trial.

The House (and the Senate) have the power to declare war - although the last time this
happened was in 1941.

Other interesting facts about the House


The Speaker of the House - chosen by the majority party - has considerable power. He or
she presides over the House and sets the agenda, assigns legislation to committees, and
determines whether and how a bill reaches the floor of the chamber.

Currently the Speaker of the House - chosen by the majority Republicans - is Paul Ryan, while the Minority Leader is Democrat Nancy Pelosi.

Much of the work of the House is done through 20 standing committees and around 100
sub-committees which perform both legislative functions (drafting Bills) and investigatory
functions (holding enquiries). Most of the committees are focused on an area of
government activity such as homeland security, foreign affairs, agriculture, energy, or
transport, but others are more cross-cutting such as those on the budget and ethics.

Activity in the House of Representatives tends to be more partisan than in the Senate.
One illustration of this is the so-called Hastert Rule. This Rule's introduction is widely
credited to former Speaker Dennis Hastert (1999-2007); however, Newt Gingrich, who
directly preceded Hastert as Speaker (1995-1999), followed the same rule. The Hastert
Rule, also known as the "majority of the majority" rule, is an informal governing principle
used by Republican Speakers of the House of Representatives since the mid-1990s to
maintain their speakerships and limit the power of the minority party to bring bills up for a
vote on the floor of the House. Under the doctrine, the Speaker of the House will not allow
a floor vote on a bill unless a majority of the majority party supports the bill. The rule
keeps the minority party from passing bills with the assistance of a small number of
majority party members.

The House of Representatives has met in its chamber in the south wing of the Capitol in
Washington DC since 1857.

Offices of members of the House are located in three buildings on the south side of the
Capitol along Independence Avenue: the Cannon, Longworth, and Rayburn Buildings.

The House and Senate are often referred to by the media as Capitol Hill or simply the
Capitol or the Hill.
Page 50

American Jews

Twenty-three refugees from Portuguese Brazil who arrived in New Amsterdam in the summer of 1654 were the first
Jews to settle in the American colonies. At the time of the Revolution, the American Jewish population numbered no
more than 1,500. There were no more than five or six Jewish congregations in the colonies, no Jewish newspapers,
and not a single rabbi.

During the early 19th century, the Jewish population remained small. By 1812, New York City had the new nation's
largest Jewish population--just 50 families. In 1816, the first organized Protestant efforts to convert Jews to
Christianity began. These efforts sensitized American Jews to their distinctive identity and encouraged Jewish
communities to establish their own schools, hospitals, and synagogues and appoint foreign rabbis as religious
leaders.

By 1850, migration from central and western Europe increased the Jewish population from approximately 2,000 to
50,000. Thousands of immigrants from Germany, Poland, and Hungary in the 1850s tripled the size of the Jewish
population to 150,000 in 1860.

A major challenge American Jews were confronted with was adapting religious orthodoxy to the realities of American
life. Most early 19th century Jews lived in small towns where it was impossible to obey traditional laws--towns lacked
synagogues, a mikvah (ritual bath), a ritual circumciser, and a kosher butcher. Many Jews also found it impossible to
refrain from working on the Jewish Sabbath, Saturday.

As early as 1824, a group of Jews in Charleston, S.C., organized one of the country's first Reformed congregations.
Their aim was to modify "such parts of the prevailing system of Worship, as are inconsistent with the present
enlightened state of society, and not in accordance with the Five Books of Moses and the Prophets." Contrary to
Orthodox practice, they worshiped from an English language prayer book, with their heads uncovered, while listening
to instrumental music. In later years, many other congregations "Americanized" their rituals by playing organ music
during services, permitting men and women to worship side by side, allowing men to pray without the traditional
prayer shawl and head covering, and establishing confirmation ceremonies for boys and girls.

American Jews avidly formed community and charitable institutions. Even small towns that lacked Jewish
congregations had a B'nai B'rith, a lodge and benevolent society founded in 1843, or a Young Men's Hebrew
Association (the first was formed in 1854), as well as separate orphan asylums and burial societies.

Jews experienced less discrimination and persecution than Catholics or Mormons in part because of their small
numbers and in part because the Jewish community was scattered and decentralized--and therefore did not provoke
fears of conspiracy. Equally important, Jews shed distinctive dress and shaved long sideburns that set European Jews
apart. But Jews vigorously resisted threats to their identity, strongly opposing state laws that limited membership in
state legislatures to Christians and that banned commerce on the Christian Sabbath, as well as efforts of Christian
missionaries to convert them and the recitation of Christian prayers in public schools. Pre-Civil War Jews engaged in
an uneasy balancing act: they struggled to shed the appearance of foreignness and modernize Jewish traditions while
sustaining Jewish distinctiveness.
Page 51

THE PRESIDENCY

What is the Presidency?

The President is the head of the executive branch of the federal government of the United
States. He - so far, the position has always been held by a man - is both the head of state and
the head of government, as well as the military commander-in-chief and chief diplomat.

The President presides over the executive branch of the government, a vast organisation
numbering about four million people, including one million active-duty military personnel. The
so-called Hatch Act of 1939 forbids anyone in the executive branch - except the President or
Vice-President - from using his or her official position to engage in political activity.

Who is eligible to become a President?

To be President, one has to:


be a natural-born citizen of the United States
be at least 35 years old
have lived in the US for at least 14 years

How is a President chosen?

The President is elected for a fixed term of four years and may serve a maximum of two terms.
Originally there was no constitutional limit on the number of terms that a President could serve
in office and the first President George Washington set the precedent of serving only two
terms. Following the election of Franklin D Roosevelt to a record four terms, it was decided to
limit terms to two and the relevant constitutional change - the 22nd Amendment - was enacted
in 1951.

Elections are always held on the first Tuesday after the first Monday in November to coincide
with Congressional elections. So the last election was held on 8 November 2016 and the next election will be held on 3 November 2020.

The President is not elected directly by the voters but by an Electoral College representing each
state on the basis of a combination of the number of members in the Senate (two for each
state regardless of size) and the number of members in the House of Representatives (roughly
proportional to population). The states with the largest number of votes are California (55),
Texas (38) and New York (29). The states with the smallest number of votes - there are seven
of them - have only three votes. The District of Columbia, which has no voting representation
in Congress, has three Electoral College votes. In effect, therefore, the Presidential election is
not one election but 51.

The total Electoral College vote is 538. This means that, to become President, a candidate has
to win at least 270 electoral votes. The voting system awards the Electoral College votes from each state to electors pledged to vote for a certain candidate in a "winner take all" system,
with the exception of Maine and Nebraska (which award their Electoral College votes according
to Congressional Districts rather than for the state as a whole). In practice, most states are
firmly Democrat - for instance, California and New York - or firmly Republican - for instance,
Texas and Tennessee. Therefore, candidates concentrate their appearances and resources on
the so-called "battleground states", those that might go to either party. The three largest
battleground or swing states are Florida (29 votes), Pennsylvania (20) and Ohio (18). Others
Page 52

include North Carolina (15), Virginia (13), Wisconsin (10), Colorado (9), Iowa (6) and Nevada
(6).
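
A minimal sketch of the "winner take all" tally described above is given below. The state weights are taken from the figures quoted in the passage; the candidate names and state-by-state outcomes are invented for illustration, and the Maine/Nebraska district split is ignored.

```python
from typing import Dict

# Electoral College weights for a handful of states, as quoted in the passage.
ELECTORAL_VOTES = {"California": 55, "Texas": 38, "New York": 29, "Florida": 29,
                   "Pennsylvania": 20, "Ohio": 18}

def tally_electoral_college(state_winners: Dict[str, str]) -> Dict[str, int]:
    """Sum Electoral College votes under a simplified winner-take-all rule.

    Ignores Maine and Nebraska, which allocate some votes by congressional district.
    """
    totals: Dict[str, int] = {}
    for state, winner in state_winners.items():
        totals[winner] = totals.get(winner, 0) + ELECTORAL_VOTES[state]
    return totals

# Invented state-by-state outcomes, purely for illustration:
totals = tally_electoral_college({
    "California": "Candidate X", "New York": "Candidate X", "Pennsylvania": "Candidate X",
    "Texas": "Candidate Y", "Florida": "Candidate Y", "Ohio": "Candidate Y",
})
print(totals)                                    # {'Candidate X': 104, 'Candidate Y': 85}
print({c: v >= 270 for c, v in totals.items()})  # 270 of the 538 total votes wins outright
```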

This system of election means that a candidate can win the largest number of votes nationwide
but fail to win the largest number of votes in the Electoral College and therefore fail to become
President. Indeed, in practice, this has happened four times in US history: 1876, 1888, 2000
and 2016. On the last occasion, the losing candidate (Hillary Clinton) actually secured 2.9
million more votes than the winning candidate (Donald Trump). If this seems strange (at least
to non-Americans), the explanation is that the 'founding fathers' who drafted the American
Constitution did not wish to give too much power to the people and so devised a system that
gives the ultimate power of electing the President to members of the Electoral College. The
same Constitution, however, enables each state to determine how its members in the Electoral
College are chosen and since the 1820s states have chosen their electors by a direct vote of
the people. The United States is the only example in the world of an indirectly elected
executive president.

In the event that the Electoral College is evenly divided between two candidates or no
candidate secures a majority of the votes, the constitution provides that the choice of President
is made by the House of Representatives and the choice of Vice-President is made by the
Senate. In the first case, the representatives of each state have to agree collectively on the
allocation of a single vote. In the second case, each senator has one vote. This has actually
happened twice - in 1800 and 1824. In 1800, the House of Representatives, after 35 votes in
which neither Thomas Jefferson nor Aaron Burr obtained a majority, elected Jefferson on the
36th ballot. In 1824, neither John Quincy Adams nor Andrew Jackson was able to secure a
majority of the votes in the Electoral College and the House of Representatives chose Adams
even though he had fewer Electoral College votes and fewer votes at the ballot boxes than
Jackson.

What are the powers of the President?

Within the executive branch, the President has broad constitutional powers to manage
national affairs and the workings of the federal government.

The President may issue executive orders to affect internal policies. The use of executive
orders has varied enormously between presidents and is often a controversial matter
since, in effect, it is bypassing the Congress to achieve what would otherwise require legislation. Very few such orders were issued until the time of Abraham Lincoln (the Emancipation Proclamation was such an order); use of executive orders was considerable
and peaked during the terms of the seven presidents from Theodore Roosevelt to Franklin
D Roosevelt (1901-1945); but, since the Second World War, use has been more modest
with Democrats tending to issue them a bit more than Republicans. Barack Obama has
made very sparing use of this power, notably to reform immigration law and to tighten
gun controls. Executive orders can be overturned by a succeeding President.

The President has the power to recommend measures to Congress and may sign or veto
legislation passed by Congress. The Congress may override a presidential veto but only by
a two-thirds majority in each house.

The President has the authority to appoint Cabinet members, Supreme Court justices, federal judges, and ambassadors, but only with the 'advice and consent' of the Senate, which can be problematic especially when the Senate is controlled by a different political party to that of the President.
Page 53

The President has the power to pardon criminals convicted of offences against the federal government; most controversially, President Gerald Ford used this power to pardon his predecessor Richard Nixon.

The President has the power to make treaties with the 'advice and consent' of the Senate.

The President can commit troops to military action for up to 60 days but then has to have the approval of Congress (although it can be difficult to withdraw troops once they have been committed).

Since 1939, there has been an Executive Office of the President (EOP) which has consistently
and considerably expanded in size and power. Today it consists of some 1,600 staff and costs
some $300M a year.

Besides the formal powers of the President, there are informal means of exercising influence.
Most notably, Teddy Roosevelt introduced the notion of 'the bully pulpit': the ability of the
President to use his standing to influence public opinion. Over time, the changing nature of
media - newspapers, radio, television, the Internet, social media - has presented a variety of
instruments for the White House to use to 'push' Congress or other political players or indeed
communicate directly with the electorate. Currently Donald Trump uses his personal Twitter
account to issue several messages a day to (as at summer 2017) some 32.4 million followers.
Add to that his POTUS Twitter account (18.8 million followers), Facebook pages (22.4 million
likes and 1.7 million followers), YouTube subscribers (103,000 and 4.3 million), and Instagram
(7 million followers). That is a lot of 'bullying'.

Other interesting facts about the Presidency

Although the 'founding fathers' wanted to avoid a political system that in any way
reflected the monarchical system then prevalent in Britain and for a long time the
Presidency was relatively weak, the vast expansion of the federal bureaucracy and the
military in the 20th century has in current practice given a greater role and more power to
the President than is the case for any single individual in most political systems.

The President may be impeached and, if convicted, removed from office. The House of Representatives has the sole power of impeachment, while the Senate has the sole power to try all such impeachments. Two U.S. Presidents have been impeached by the House of Representatives but acquitted at the trials held by the Senate: Andrew Johnson (1868) and Bill Clinton (1999). Richard Nixon resigned in 1974 before he could be impeached, as he almost certainly would have been.

The President may also be removed from office if the Vice President and a majority of the principal officers of the executive departments decide that the President is unable to discharge the powers and duties of his office. In fact, this provision of the Constitution - the 25th Amendment - has never been invoked.

Although the President heads the executive branch of government, the day-to-day
enforcement and administration of federal laws is in the hands of the various federal
executive departments, created by Congress to deal with specific areas of national and
international affairs. The heads of the 15 departments, chosen by the President and
approved with the 'advice and consent' of the Senate, form a council of advisors generally
known as the President's "Cabinet". This is not a cabinet in the British political sense: it
does not meet so often and does not act so collectively.
Page 54

In fact, the President has powers of patronage that extend way beyond appointment of
Cabinet members. In all, the President appoints roughly 4,000 individuals to positions in
the federal government, of which around 1,200 require the confirmation of the Senate. As
the divisions in American politics have deepened, so the confirmation process has become
more fractious and prolonged - when first elected, Barack Obama had to wait ten months
before all his nominees were in their jobs.

The first United States President was George Washington, who served from 1789-1797, so
that the current President Donald Trump is the 44th to hold the office. However, there
have been 45 presidencies. Grover Cleveland was the 22nd and 24th President and
therefore was the only US president to serve two non-consecutive terms (1885-1889 and
1893-1897) and to be counted twice in the numbering of the presidents.

So far, every US President has been male. All but one President has been Protestant (the
exception was John Kennedy who was a Catholic) and all but one President has been
white (the exception is Barack Obama). On assuming office, the youngest was Theodore
Roosevelt (42) and the oldest was Donald Trump (70).

Four sitting Presidents have been assassinated: Abraham Lincoln in 1865, James A.
Garfield in 1881, William McKinley in 1901, and John F. Kennedy in 1963. A further eight
Presidents were subject to near misses in assassination attempts.

The President is sometimes referred to as POTUS (President Of The United States) and the
Presidency is often referred to by the media as variously the White House, the West Wing,
and the Oval Office.

Such is the respect for the Presidency that, even having left office, a President is referred
to by the title for the remainder of his life.

The position of Vice-President is elected on the same ticket as that of the President and has the
same four-year term of office. The Vice-President is often described as 'a heart beat away from
the Presidency' since, in the event of the death or incapacity of the President, the Vice-
President assumes the office.

In practice, however, a Vice-Presidential candidate is chosen (by the Presidential candidate) to 'balance the ticket' in the Presidential election (that is, represent a different geographical or
gender or ethnic constituency) and, for all practical purposes, the position only carries the
power accorded to it by the President - which is usually very little (a major exception has been
Dick Cheney under George W Bush). The official duties of the Vice-President are to sit as a
member of the "Cabinet" and as a member of the National Security Council and to act as ex-
officio President of the Senate.
Page 55

THE SENATE

What is the Senate?

The Senate is the upper chamber in the bicameral legislature known collectively as Congress.
The original intention of the authors of the US Constitution was that the Senate should be a
regulatory group, less politically dominant than the House. However, since the mid 19th
century, the Senate has been the dominant chamber and indeed today it is perhaps the most
powerful upper house of any legislative body in the world.

Who is eligible to become a member of the Senate?

To be a member of the Senate, one has to:

be at least 30 years old
have been a US citizen for at least nine years
live in the state which one represents

How is a member of the Senate chosen?

The Senate consists of 100 members, each of whom represents a state and serves for a six-
year term (one third of the Senate stands for election every two years).

Each state has two Senators, regardless of population, and, since there are 50 states, then
there are 100 senators. This equality of Senate seats between states has the effect of
producing huge variations in constituency population (the two senators from Wyoming
represent less than half a million electors, while the two senators from California represent
34M people) with gross over-representation of the smaller states and serious under-
representation of racial and ethnic minorities.

For a long time, Senators were elected by the individual state legislatures. However, since the
17th Amendment to the Constitution in 1913, members of the Senate are elected by first-past-
the-post voting in every state except Louisiana and Washington, which have run-offs. Elections
are always held on the first Tuesday after the first Monday in November in even numbered
years.

Each Senator is known as the senior or junior Senator for his or her state, based on length of
service.

In the event that a member of the Senate dies or resigns before the end of the six-year term,
a special election is not normally held at that time (this is the case for 46 states). Instead the
Governor of the state that the Senator represented nominates someone to serve until the next
set of Congressional elections when the special election is held to fill the vacancy.

What are the powers of the Senate?

The Senate is one of the two chambers that can initiate and pass legislation, although to
become law any legislation has to be approved by the House of Representatives as well.

Each chamber of Congress has particular exclusive powers. The Senate must give 'advice
and consent' to many important Presidential appointments including Cabinet members,
Supreme Court justices, federal judges, and ambassadors.
Page 56

The Senate has the responsibility of ratifying treaties.

If the Electoral College is tied, the choice of Vice-President is made by the Senate.

The Senate has a key role in any impeachment proceedings against the President or Vice-
President. Once the House of Representatives has laid the charges, the Senate then
conducts a trial on these charges. The Supreme Court Chief Justice presides over such a
trial. A two-thirds majority of the Senate is required to uphold impeachment charges.

The Senate (and the House) have the power to declare war - although the last time this
happened was in 1941.

Other interesting facts about the Senate

The most powerful position in the Senate is the Majority Leader but he or she does not
have the same control over the upper chamber as the control that the Speaker of the
House has over the lower chamber, since the 'whipping' system is weaker in the Senate.

Currently the Majority Leader in the Senate is the Republican Mitch McConnell, while the Minority Leader is Democrat Chuck Schumer.

Much of the work of the Senate is done through 16 standing committees and around 40
sub-committees which perform both legislative functions (drafting Bills) and investigatory
functions (holding enquiries). Most of the committees are focused on an area of
government activity such as homeland security, foreign relations, health, energy, or
transport, but others are more cross-cutting such as those on the budget and rules.

Activity in the Senate tends to be less partisan and more individualistic than in the House
of Representatives. Senate rules permit what is called a filibuster when a Senator, or a
series of Senators, can speak for as long as they wish and on any topic they choose,
unless a supermajority of three-fifths of the Senate (60 Senators, if all 100 seats are
filled) brings debate to a close by invoking what is called cloture (taken from the French
term for closure).

The Senate has met in its chamber in the north wing of the Capitol in Washington DC
since 1859.

Offices of members of the Senate are located in three buildings on the north side of the
Capitol along Constitution Avenue: the Russell, Dirksen, and Hart Buildings.

The Senate and House are often referred to by the media as Capitol Hill or simply the
Capitol or the Hill.
Page 57

NASA Technologies Benefit Our Lives


Trace Space Back to You!

Have you ever wondered how space exploration impacts your daily life?

Space exploration has created new markets and new technologies that have spurred our economy and
changed our lives in many ways. This year, NASA unveiled two new complementary interactive Web
features, NASA City and NASA @ Home, available at www.nasa.gov/city. The new features highlight
how space pervades our lives, invisible yet critical to so many aspects of our daily activities and well-
being.

Health and Medicine

Light-Emitting Diodes (LEDs)

Red light-emitting diodes are growing plants in space and healing humans on Earth.
The LED technology used in NASA space shuttle plant growth experiments has
contributed to the development of medical devices such as award-winning WARP 10,
a hand-held, high-intensity, LED unit developed by Quantum Devices Inc. The WARP
10 is intended for the temporary relief of minor muscle and joint pain, arthritis,
stiffness, and muscle spasms, and also promotes muscle relaxation and increases local blood
circulation. The WARP 10 is being used by the U.S. Department of Defense and U.S. Navy as a
noninvasive “soldier self-care” device that aids front-line forces with first aid for minor injuries and pain,
thereby improving endurance in combat. The next-generation WARP 75 has been used to relieve pain
in bone marrow transplant patients, and will be used to combat the symptoms of bone atrophy, multiple
sclerosis, diabetic complications, Parkinson’s disease, and in a variety of ocular applications.
(Spinoff 2005, 2008)

Infrared Ear Thermometers 

Diatek Corporation and NASA developed an aural thermometer, which weighs only 8 ounces and
uses infrared astronomy technology to measure the amount of energy emitted by the eardrum, the
same way the temperature of stars and planets is measured. This method avoids contact with
mucous membranes, virtually eliminating the possibility of cross infection, and permits rapid
temperature measurement of newborn, critically-ill, or incapacitated patients. NASA supported the
Diatek Corporation, a world leader in electronic thermometry, through the Technology Affiliates
Program. (Spinoff 1991)
Page 58

Artificial Limbs

NASA’s continued funding, coupled with its collective innovations in robotics and shock-absorption/comfort materials, is inspiring and enabling the private sector to create new and better solutions for animal and human prostheses. Advancements such as Environmental
Robots Inc.’s development of artificial muscle systems with robotic sensing and actuation
capabilities for use in NASA space robotic and extravehicular activities are being adapted to
create more functionally dynamic artificial limbs (Spinoff 2004). Additionally, other private-sector
adaptations of NASA’s temper foam technology have brought about custom-moldable materials offering
the natural look and feel of flesh, as well as preventing friction between the skin and the prosthesis, and
heat/moisture buildup. (Spinoff 2005)

Ventricular Assist Device

Collaboration between NASA, Dr. Michael DeBakey, Dr. George Noon, and MicroMed
Technology Inc. resulted in a lifesaving heart pump for patients awaiting heart transplants. The
MicroMed DeBakey ventricular assist device (VAD) functions as a “bridge to heart transplant” by
pumping blood throughout the body to keep critically ill patients alive until a donor heart is
available. Weighing less than 4 ounces and measuring 1 by 3 inches, the pump is
approximately one-tenth the size of other currently marketed pulsatile VADs. This makes it less
invasive and ideal for smaller adults and children. Because of the pump’s small size, less than 5
percent of the patients implanted developed device-related infections. It can operate up to 8 hours on
batteries, giving patients the mobility to do normal, everyday activities. (Spinoff 2002)

Transportation

Anti-Icing Systems

NASA funding under the Small Business Innovation Research (SBIR) program and work with
NASA scientists advanced the development of the certification and integration of a
thermoelectric deicing system called Thermawing, a DC-powered air conditioner for single-
engine aircraft called Thermacool, and high-output alternators to run them both. Thermawing, a
reliable anti-icing and deicing system, allows pilots to safely fly through ice encounters and
provides pilots of single-engine aircraft the heated wing technology usually reserved for larger,
jet-powered craft. Thermacool, an innovative electric air conditioning system, uses a new compressor
whose rotary pump design runs off an energy-efficient, brushless DC motor and allows pilots to use the
air conditioner before the engine even starts. (Spinoff 2007)

Highway Safety
Safety grooving, the cutting of grooves in concrete to increase traction and prevent injury, was
first developed to reduce aircraft accidents on wet runways. Represented by the International
Grooving and Grinding Association, the industry expanded into highway and pedestrian
applications. The technique originated at Langley Research Center, which assisted in testing
the grooving at airports and on highways. Skidding was reduced, stopping distance decreased, and a
vehicle’s cornering ability on curves was increased. The process has been extended to animal holding
pens, steps, parking lots, and other potentially slippery surfaces. (Spinoff 1985)
Page 59

Improved Radial Tires

Goodyear Tire and Rubber Company developed a fibrous material, five times stronger than
steel, for NASA to use in parachute shrouds to soft-land the Vikings on the Martian surface. The
fiber’s chain-like molecular structure gave it incredible strength in proportion to its weight.
Recognizing the increased strength and durability of the material, Goodyear expanded the
technology and went on to produce a new radial tire with a tread life expected to be 10,000 miles
greater than conventional radials. (Spinoff 1976)

Chemical Detection

NASA contracted with Intelligent Optical Systems (IOS) to develop moisture- and pH-sensitive
sensors to warn of potentially dangerous corrosive conditions in aircraft before significant
structural damage occurs. This new type of sensor, using a specially manufactured optical fiber
whose entire length is chemically sensitive, changes color in response to contact with its target.
After completing the work with NASA, IOS was tasked by the U.S. Department of Defense to further
develop the sensors for detecting chemical warfare agents and potential threats, such as toxic industrial
compounds and nerve agents, for which they proved just as successful. IOS has additionally sold the
chemically sensitive fiber optic cables to major automotive and aerospace companies, who are finding
a variety of uses for the devices such as aiding experimentation with nontraditional power sources, and
as an economical “alarm system” for detecting chemical release in large facilities. (Spinoff 2007)

Public Safety

Video Enhancing and Analysis Systems

Intergraph Government Solutions developed its Video Analyst System (VAS) by building on
Video Image Stabilization and Registration (VISAR) technology created by NASA to help FBI
agents analyze video footage. Originally used for enhancing video images from nighttime
videotapes made with hand-held camcorders, VAS is a state-of-the-art, simple, effective, and
affordable tool for video enhancement and analysis offering benefits such as support of full-
resolution digital video, stabilization, frame-by-frame analysis, conversion of analog video to
digital storage formats, and increased visibility of filmed subjects without altering underlying footage.
Aside from law enforcement and security applications, VAS has also been adapted to serve the military
for reconnaissance, weapons deployment, damage assessment, training, and mission debriefing.
(Spinoff 2001)

Land Mine Removal

Due to arrangements such as the one between Thiokol Propulsion and NASA that permits
Thiokol to use NASA’s surplus rocket fuel to produce a flare that can safely destroy land mines,
NASA is able to reduce propellant waste without negatively impacting the environment, and
Thiokol is able to access the materials needed to develop the Demining Device flare. The
Demining Device flare uses a battery-triggered electric match to ignite and neutralize land mines
Page 60

in the field without detonation. The flare uses solid rocket fuel to burn a hole in the mine’s case and
burn away the explosive contents so the mine can be disarmed without hazard. (Spinoff 2000)
Fire-Resistant Reinforcement

Built and designed by Avco Corporation, the Apollo heat shield was coated with a material
whose purpose was to burn and thus dissipate energy during reentry while charring, to form a
protective coating to block heat penetration. NASA subsequently funded Avco’s development of
other applications of the heat shield, such as fire-retardant paints and foams for aircraft, which
led to the world’s first intumescent epoxy material, which expands in volume when exposed to heat or
flames, acting as an insulating barrier and dissipating heat through burn-off. Further innovations based
on this product include steel coatings devised to make high-rise buildings and public structures safer by
swelling to provide a tough and stable insulating layer over the steel for up to 4 hours of fire protection,
ultimately to slow building collapse and provide more time for escape. (Spinoff 2006)

Firefighter Gear

Firefighting equipment widely used throughout the United States is based on a NASA
development that coupled Agency design expertise with lightweight materials developed for the
U.S. Space Program. A project that linked NASA and the National Bureau of Standards resulted
in a lightweight breathing system including face mask, frame, harness, and air bottle, using an
aluminum composite material developed by NASA for use on rocket casings. Aerospace
technology has been beneficially transferred to civil-use applications for years, but perhaps the
broadest fire-related technology transfer is the breathing apparatus worn by firefighters for protection
from smoke inhalation injury. Additionally, radio communications are essential during a fire to
coordinate hose lines, rescue victims, and otherwise increase efficiency and safety. NASA’s
inductorless electronic circuit technology contributed to the development of a lower-cost, more rugged,
short-range two-way radio now used by firefighters. NASA also helped develop a specialized mask
weighing less than 3 ounces to protect the physically impaired from injuries to the face and head, as
well as flexible, heat-resistant materials—developed to protect the space shuttle on reentry—which are
being used both by the military and commercially in suits for municipal and aircraft-rescue firefighters.
(Spinoff 1976)

Consumer, Home, and Recreation

Temper Foam
As the result of a program designed to develop a padding concept to improve crash protection
for airplane passengers, Ames Research Center developed a foam material with unusual
properties. The material is widely used and commonly known as temper foam or “memory
foam.” The material has been incorporated into a host of widely used and recognized products
including mattresses, pillows, military and civilian aircraft, automobiles and motorcycles, sports
safety equipment, amusement park rides and arenas, horseback saddles, archery targets, furniture,
and human and animal prostheses. Its high-energy absorption and soft characteristics not only offer
superior protection in the event of an accident or impact, but enhanced comfort and support for
passengers on long flights or those seeking restful sleep. Today, temper foam is being employed by
NASCAR to provide added safety in racecars. (Spinoff 1976, 1977, 1979, 1988, 1995, 2002, 2005)
Page 61

Enriched Baby Food


Commercially available infant formulas now contain a nutritional enrichment ingredient that traces
its existence to NASA-sponsored research that explored the potential of algae as a recycling agent
for long-duration space travel. The substance, formulated into the products life’sDHA and life’sARA, can be found in over 90 percent of the infant formulas sold in the United States, and is added to the infant formulas sold in over 65 additional countries. The products were developed and are
manufactured by Martek Biosciences Corporation, which has pioneered the commercial development of
products based on microalgae; the company’s founders and principal scientists acquired their expertise
in this area while working on the NASA program. (Spinoff 1996, 2008)

Portable Cordless Vacuums

Apollo and Gemini space mission technologies created by Black & Decker have helped change
the way we clean around the house. For the Apollo space mission, NASA required a portable,
self-contained drill capable of extracting core samples from below the lunar surface. Black &
Decker was tasked with the job, and developed a computer program to optimize the design of
the drill’s motor and ensure minimal power consumption. That computer program led to the development
of a cordless miniature vacuum cleaner called the Dustbuster. (Spinoff 1981)

Freeze Drying Technology

In planning for the long-duration Apollo missions, NASA conducted extensive research into
space food. One of the techniques developed was freeze drying—Action Products
commercialized this technique, concentrating on snack food. The foods are cooked, quickly
frozen, and then slowly heated in a vacuum chamber to remove the ice crystals formed by the
freezing process. The final product retains 98 percent of its nutrition and weighs only 20 percent
of its original weight. Today, one of the benefits of this advancement in food preparation
includes simple nutritious meals available to handicapped and otherwise homebound senior adults
unable to take advantage of existing meal programs sponsored by government and private
organizations. (Spinoff 1976, 1994)

Environmental and Agricultural Resources


Harnessing Solar Energy
Homes across the country are now being outfitted with modern, high-performance, low-cost,
single crystal silicon solar power cells that allow them to reduce their traditional energy
expenditures and contribute to pollution reduction. The advanced technology behind these solar
devices—which are competitively-priced and provide up to 50 percent more power than
conventional solar cells—originated with the efforts of a NASA-sponsored 28-member coalition
of companies, government groups, universities, and nonprofits forming the Environmental Research
Aircraft and Sensor Technology (ERAST) Alliance. ERAST’s goal was to foster the development of
remotely piloted aircraft intended to fly unmanned at high altitudes for days at a time, requiring
advanced solar power sources that did not add weight. As a result, SunPower Corporation created
the most advanced silicon-based cells available for terrestrial or airborne applications. (Spinoff 2005)
Page 62

Pollution Remediation

A product using NASA’s microencapsulating technology is available to consumers and industry, enabling them to safely and permanently clean petroleum-based pollutants from water. The
microencapsulated wonder, Petroleum Remediation Product or “PRP,” has revolutionized the
way oil spills are cleaned. The basic technology behind PRP is thousands of microcapsules—
tiny balls of beeswax with hollow centers. Water cannot penetrate the microcapsule’s cell, but oil
is absorbed right into the beeswax spheres as they float on the water’s surface. Contaminating
chemical compounds that originally come from crude oil (such as fuels, motor oils, or petroleum
hydrocarbons) are caught before they settle, limiting damage to ocean beds. (Spinoff 1994, 2006)

Water Purification

NASA engineers are collaborating with qualified companies to develop a complex system of
devices intended to sustain the astronauts living on the International Space Station and, in the
future, those who go on to explore the Moon. This system, tentatively scheduled for launch in
2008, will make use of available resources by turning wastewater from respiration, sweat, and
urine into drinkable water. Commercially, this system is benefiting people all over the world who
need affordable, clean water. By combining the benefits of chemical adsorption, ion exchange,
and ultra-filtration processes, products using this technology yield safe, drinkable water from the most
challenging sources, such as in underdeveloped regions where well water may be heavily
contaminated. (Spinoff 1995, 2006)

Computer Technology

Better Software

From real-time weather visualization and forecasting, high-resolution 3-D maps of the Moon and
Mars, to real-time tracking of the International Space Station and the space shuttle, NASA is
collaborating with Google Inc. to solve a variety of challenging technical problems ranging from
large-scale data management and massively distributed computing, to human-computer
interfaces—with the ultimate goal of making the vast, scattered ocean of data more accessible
and usable. With companies like InterSense, NASA continues to fund and collaborate on other
software advancement initiatives benefiting such areas as photo/video image enhancement, virtual-
reality/design, simulation training, and medical applications. (Spinoff 2005)

Structural Analysis

NASA software engineers have created thousands of computer programs over the decades
equipped to design, test, and analyze stress, vibration, and acoustical properties of a broad
assortment of aerospace parts and structures (before prototyping even begins). The NASA
Structural Analysis Program, or NASTRAN, is considered one of the most successful and
widely-used NASA software programs. It has been used to design everything from Cadillacs to
roller coaster rides. Originally created for spacecraft design, NASTRAN has been employed in a host of
non-aerospace applications and is available to industry through NASA’s Computer Software
Management and Information Center (COSMIC). COSMIC maintains a library of computer programs
from NASA and other government agencies and offers them for sale at a fraction of the cost of
developing a new program, benefiting companies around the world seeking to solve the largest, most
difficult engineering problems. (Spinoff 1976, 1977, 1978, 1979, 1980, 1981, 1982, 1986, 1988, 1990,
1991, 1998)

Refrigerated Internet-Connected Wall Ovens

Embedded Web Technology (EWT) software—originally developed by NASA for use by astronauts operating experiments
on available laptops from anywhere on the International Space Station—lets a user monitor and/or control a device
remotely over the Internet. NASA
supplied this technology and guidance to TMIO LLC, who went on to develop a low-cost, real-
time remote control and monitoring of a new intelligent oven product named “ConnectIo.” With
combined cooling and heating capabilities, ConnectIo provides the convenience of being able to
store cold food where it will remain properly refrigerated until a customized pre-programmable cooking
cycle begins. The menu allows the user to simply enter the dinner time, and the oven automatically
switches from refrigeration to the cooking cycle, so that the meal will be ready as the family arrives
home for dinner. (Spinoff 2005)

Industrial Productivity

Powdered Lubricants

NASA’s scientists developed a solid lubricant coating material that is saving the manufacturing
industry millions of dollars. Developed as a shaft coating to be deposited by thermal spraying to
protect foil air bearings used in oil-free turbomachinery, like gas turbines, this advanced coating,
PS300, was meant to be part of a larger project: an oil-free aircraft engine capable of operating
at high temperatures with increased reliability, lowered weight, reduced maintenance, and increased
power. PS300 improves efficiency, lowers friction, reduces emissions, and has been used by NASA in
advanced aeropropulsion engines, refrigeration compressors, turbochargers, and hybrid electrical
turbogenerators. ADMA Products has found widespread industrial applications for the material.
(Spinoff 2005)

Improved Mine Safety

An ultrasonic bolt elongation monitor developed by a NASA scientist for testing tension and
high-pressure loads on bolts and fasteners has continued to evolve over the past three
decades. Today, the same scientist and Luna Innovations are using a digital adaptation of this
same device for a plethora of different applications, including non-destructive evaluation of
railroad ties, groundwater analysis, radiation dosimetry, and as a medical testing device to assess
levels of internal swelling and pressure for patients suffering from intracranial pressure and
compartment syndrome, a painful condition that results when pressure within muscles builds to
dangerous levels. The applications for this device continue to expand. (Spinoff 1978, 2005, 2008)

Food Safety Systems

Faced with the problem of how and what to feed an astronaut in a sealed capsule under
weightless conditions while planning for human space flight, NASA enlisted the aid of The
Pillsbury Company to address two principal concerns: eliminating crumbs of food that might
contaminate the spacecraft’s atmosphere and sensitive instruments, and assuring absolute
freedom from potentially catastrophic disease-producing bacteria and toxins. Pillsbury
developed the Hazard Analysis and Critical Control Point (HACCP) concept, potentially one of the most
far-reaching space spinoffs, to address NASA’s second concern. HACCP is designed to prevent food
safety problems rather than to catch them after they have occurred. The U.S. Food and Drug
Administration has applied HACCP guidelines for the handling of seafood, juice, and dairy products.
(Spinoff 1991)

The Space Race

In October 1957, the Soviet Union launched Sputnik 1, the world's first artificial satellite. The 184-pound, 22.5-inch
sphere orbited the earth once every 96 minutes. Sputnik transmitted radio signals for 21 days and later burned up in
the earth's atmosphere. A second Sputnik, launched in November 1957, carried a dog named Laika. This satellite
weighed a thousand pounds.

In December, the United States made its first attempt at a satellite launch. A Navy Vanguard rocket, carrying a
payload only one-fortieth the size of Sputnik, lifted a few feet off its launch pad before falling back to earth. It
exploded in a ball of orange flames and black smoke. Premier Khrushchev boasted that "America sleeps under a
Soviet moon." Because Sputnik was launched on an intercontinental ballistic missile, Soviet leaders cited it as proof
that they could deliver hydrogen bombs at will.

Sputnik's launch meant that the Cold War competition between the Soviet Union and the United States would take
place, not only on earth, but also in outer space. Americans, who thought of themselves as the world's technology
pacesetters, felt vulnerable, a sensation that was reinforced in 1959, when the Soviet Union fired the first rockets to
circle the moon and brought back pictures of its dark side. In April 1961, the Soviets launched the first manned
spaceship into orbit, piloted by 27-year-old Soviet Cosmonaut Yuri Gagarin. In 1966, the Soviets were the first to
land an unmanned vehicle on the moon.

Sputnik led Congress to pass a series of massive federal aid-to-education measures. Science became a priority in
schools and universities. Soviet space successes led President John F. Kennedy to tell a joint session of Congress in
May 1961 that the United States would land a man on the moon and bring him home by the end of the 1960s.

The U.S. space program passed through several stages. There were six one-man flights in the Mercury program,
which expanded from suborbital flights to an orbital mission that lasted more than 34 hours. The Gemini program
followed with ten two-man flights, including the first spacewalk and the rendezvous and docking of two spacecraft.
One mission lasted 14 days.

Then disaster struck. In January 1967, a fire destroyed a prototype command module, killing the crew of Apollo 1.
Four manned flights in late 1968 and early 1969 paved the way for a historic launch of Apollo 11. The launch was
witnessed by a million people assembled along Florida's beaches.

At 4:17 p.m. Eastern time, July 20, 1969, astronaut Neil Armstrong announced: "Houston...the Eagle has landed."
The landing vehicle had less than a minute's worth of fuel remaining. The astronauts spent only two-and-a-half hours
walking on the lunar surface.

Eight years after President Kennedy had called on the United States to land a man on the moon, the mission had
been successfully accomplished. A total of 400,000 American employees from 20,000 companies had worked directly
on the Apollo program. The cost was $25 billion.

Today, more than half of all Americans are too young to remember that historic mission. At the Johnson Space
Center in Houston, a Saturn V rocket--bigger than a 40-story building--lies on the ground. It is not a mockup. It was
intended to carry Apollo 18 to the moon. But due to budget cutbacks, the mission was never carried out.

Conquering Space

Prior to 1812, westward expansion had proceeded slowly. Most Americans were nestled along the Atlantic coastline.
More than two-thirds of the new nation's population still lived within 50 miles of the Atlantic seaboard, and the
center
of population rested within 18 miles of Baltimore. Only two roads cut across the Allegheny Mountains, and no more
than half a million pioneers had moved as far west as Kentucky, Tennessee, Ohio, or the western portion of
Pennsylvania. Cincinnati was a town of 15,000 people; Buffalo and Rochester, New York, did not yet exist.
Kickapoos,
Miamis, Wyandots, and other Indian peoples populated the areas that would become the states of Illinois, Indiana,
Michigan, and Wisconsin, while Cherokees, Chickasaws, Choctaws, and Creeks considered the future states of
Alabama, Mississippi, and western Georgia their territory.

Between 1803, when Ohio was admitted to the Union, and the beginning of the War of 1812, not a single new state
was carved out of the west. Thomas Jefferson estimated in 1803 that it would be a thousand years before settlers
occupied the region east of the Mississippi.

The end of the War of 1812 unleashed a rush of pioneers to Indiana, Illinois, Ohio, northern Georgia, western North
Carolina, Alabama, Mississippi, Louisiana, and Tennessee. Congress quickly admitted five states to the Union:
Louisiana in 1812, Indiana in 1816, Mississippi in 1817, Illinois in 1818, and Alabama in 1819.

Farmers demanded that Congress revise legislation to make it easier to obtain land. Originally, Congress viewed
federal lands as a source of revenue, and public land policies reflected that view. Under a policy adopted in 1785 and
reaffirmed in 1796, the federal government only sold land in blocks of at least 640 acres. Although the minimum
allotment was reduced to 320 acres in 1800, federal land policy continued to retard sales and concentrate ownership
in the hands of a few large land companies and wealthy speculators.

In 1820, Congress sought to make it easier for farmers to purchase homesteads in the West by selling land in small
lots suitable for operation by a family. Congress reduced the minimum allotment offered for sale from 320 to 80
acres. The minimum price per acre fell from $2 to $1.25. In 1796, a pioneer farmer purchasing a western farm from
a federal land office had to buy 640 acres costing $1280. In 1820, a farmer could purchase 80 acres for just $100.
The second Bank of the United States encouraged land purchases by liberally extending credit. The result was a
boom in land sales. For a decade, the government sold approximately a million acres of land annually.

Westward expansion also created a demand to expand and improve the nation's roads and canals. In 1808, Albert
Gallatin, Thomas Jefferson's Treasury secretary, proposed a $20 million program of canal and road construction. As a
result of state and sectional jealousies and charges that federal aid to transportation was unconstitutional, the federal
government funded only a single turnpike, the National Road, at this time stretching from Cumberland, Maryland, to
Wheeling, Virginia (later West Virginia), but much later extending westward from Baltimore through Ohio and Indiana
to Vandalia, Illinois.

In 1816, John C. Calhoun introduced a new proposal for federal aid for road and canal construction. Failure to link the
nation together with an adequate system of transportation would, Calhoun warned, lead "to the greatest of
calamities--disunion." "Let us," he exclaimed, "bind the republic together with a perfect system of roads and canals.
Let us conquer space." Narrowly, Calhoun's proposal passed. But on the day before he left office, Madison vetoed the
bill on constitutional grounds.

Despite this setback, Congress did adopt major parts of the nationalist neo-Hamiltonian economic program. It had
established a second Bank of the United States to provide a stable means of issuing money and a safe depository for
federal funds. It had enacted a tariff to raise duties on foreign imports and guard American industries from low-cost
competition. It had also instituted a new public land policy to encourage western settlement. In short, Congress had
translated the spirit of national pride and unity that the nation felt after the War of 1812 into a legislative program
that placed the national interest above narrow sectional interests.

Courtship in Early America

Late in the winter of 1708/9, Samuel Gerrish, a Boston bookseller, began to court Mary Sewall, the 18-year-old
daughter of Puritan magistrate Samuel Sewall. Judge Sewall was a conscientious father, and like many Puritan
fathers believed that he had a right and duty to take an active role in his daughter's selection of a spouse. He had
heard ‘various and uncertain reports’ that young Gerrish had previously courted other women and immediately
dashed off a letter to Gerrish’s father demanding ‘the naked Truth.’ Only after receiving a satisfactory reply did Judge
Sewall permit the courtship to continue. In August, after a whirlwind six month courtship, the couple married, but the
marriage was cut tragically short 15 months later when young Mary died in childbirth.

A hundred twenty-nine years later, in 1838, another couple began their courtship. Theodore Dwight Weld, a 39-year-old
abolitionist, wrote a letter to Angelina Grimke, the daughter of a wealthy, slaveholding South Carolina family who
had turned against slavery, in which he disclosed ‘that for a long time you have had my whole heart.’ He had ‘no
expectation and almost no hope that [his] feelings are in any degree RECIPROCATED BY YOU.’ Nevertheless, he
asked her to reveal her true feelings.

Angelina replied by acknowledging her own love for him: ‘I feel, my Theodore, that we are the two halves of one
whole, a twain one, two bodies animated by one soul and that the Lord has given us to each other.’

Like many early nineteenth century couples, Theodore and Angelina devoted much of their courtship to disclosing
their personal faults and dissecting their reasons for marriage. They considered romance and passion childish and
unreliable motives for marriage and instead sought a love that was more tender and rational. In his love letters,
Theodore listed his flaws and worried that he was not deserving of Angelina's love. He was a ‘vile groveling selfish
wretch’ - reckless, impatient, careless in appearance, and poorly educated. Angelina responded by confessing her
own faults - her temper, her pride, and the fact that she had once loved another man - and revealed her fear that
the vast majority of men ‘believe most seriously that women were made to gratify their animal appetites, expressly
to minister to their pleasure.’ Only after Theodore and Angelina were convinced that they were emotionally ready for
‘the most important step of Life,’ did they finally marry.

Between 1708/9, when Samuel Gerrish courted Mary Sewall, and 1838, when Theodore Weld courted Angelina
Grimke, the rituals of courtship underwent profound changes. Parental influence and involvement in the selection of
their children's marriage partner visibly declined. Young women and men were increasingly free to pick or reject a
spouse with little parental interference. At the same time that courtship grew freer, however, marriage became an
increasingly difficult transition point, particularly for women, and more and more women elected not to marry at all.

In seventeenth and early eighteenth century New England, courtship was not simply a personal, private matter. The
law gave parents ‘the care and power...for the disposing of their Children in Marriage’ and it was expected that they
would take an active role overseeing their child's choice of a spouse. A father in Puritan New England had a legal
right to determine which men would be allowed to court his daughters and a legal responsibility to give or withhold
his consent from a child's marriage. A young man who courted a woman without her father’s permission might be
sued for inveigling the woman's affections.

Parental involvement in courtship was expected because marriage was not merely an emotional relationship between
individuals but also a property arrangement among families. A young man was expected to bring land or some other
form of property to a marriage while a young woman was expected to bring a dowry worth about half as much.

In most cases, Puritan parents played little role in the actual selection of a spouse (although Judge Sewall did initiate
the courtship between his son Joseph and a neighbor named Elizabeth Walley). Instead, they tended to influence the
timing of marriage. Since Puritan children were expected to bring property to marriage, and Puritan fathers were
permitted wide discretion in when they distributed property to their children, many sons and daughters remained
economically dependent for years, delaying marriages until a relatively late age.

Today, love is considered the only legitimate reason for marriage. Puritan New Englanders, in sharp contrast, did not
regard love as a necessary precondition for marriage. Indeed, they associated romantic love with immaturity and
impermanence. True love, the Puritans believed, would appear following marriage. A proper marriage, in their view,
was based not on love and affection, but on rational considerations of property, compatibility, and religious piety.
Thus, it was considered acceptable for a young man to pursue ‘a goodly lass with aboundation of money,’ so long as
he could eventually love his wife-to-be.

By the middle of the eighteenth century, parental influence over the choice of a spouse had sharply declined. One
indication of a decline in parental control was a sudden upsurge in the mid-eighteenth century in the number of brides
who were pregnant when they got married. In the seventeenth century, fathers - supported by local churches and
courts - exercised close control over their children’s sexual behavior and kept sexual intercourse prior to marriage at
extremely low levels. The percentage of women who bore a first child less than eight-and-a-half months after
marriage was below ten percent. By the middle of the eighteenth century, the figure had shot up to over forty
percent.

Another indicator of a decline in paternal authority was an increase in children's discretion in deciding whom and
when to marry. By the middle of the eighteenth century, well before the onset of the American Revolution, the ability
of fathers to delay their sons' marriages until their late twenties had eroded.

Greater freedom in selection of a spouse was also apparent in a gradual breakdown in a seventeenth- and early
eighteenth-century pattern in which the order of a son's birth was closely connected to the economic status of his
future spouse. Although most families in early New England did not practice strict primogeniture - the right of
inheritance belonging to the eldest son - many families did assign older sons a larger share of resources than
younger children. Receiving larger inheritances themselves, eldest sons tended to marry daughters of wealthier
families. By mid-century, the close connection between birth order and a spouse’s economic status had gradually
declined.

By the middle of the eighteenth century, other signs of weakening parental control over marriage were visible. In
seventeenth century Plymouth, the brothers and sisters of one family frequently married the sisters and brothers of
another. After 1760 this pattern gave way to marriages based on individual choice. In one small Massachusetts town,
greater freedom was evident in the growing ease with which younger daughters were able to wed before their older
sisters.

As parental influence over courtship declined, a new romantic ideal of love arose. In the years just before the
Revolution, a flood of advice books, philosophical treatises, and works of fiction helped to popularize revolutionary
new ideas about courtship and marriage. Readers learned that love was superior to property as a basis for marriage
and that marriage should be based on mutual sympathy, affection, and friendship. Rather than choosing spouses on
economic grounds, young people were told to select their marriage partner on the more secure basis of love and
compatibility. In a survey of all magazines published during the 30 years before the Revolution, one issue out of four
contained a reference to romantic love as the proper basis of marriage; during the next twenty years the number of
references to romantic love tripled.

The heightened emphasis attached to romantic love can be seen in the proliferation of new kinds of love letters.
Courtship letters changed by the nineteenth century from brief notes to longer, more effusive expositions of feelings
and emotions. Seventeenth century Puritans tended to moderate expression of affection in love letters. A letter from
a Westfield, Massachusetts, minister to his sweetheart was not atypical. After describing his passion for her as ‘a golden
ball of pure fire,’ he added that his affection ‘must be kept within bounds too. For it must be subordinate to God’s
Glory.’

By the late eighteenth century, love letters, particularly those written by men, had grown more expansive and less
formal. Instead of addressing their beloved in highly formalized terms, lovers began to use such terms of endearment
as ‘dearest’ or ‘my beloved.’ In their love letters, couples described feelings of affection that were deeply romantic. In
1844, Alexander Rice, a student at Union College in Schenectady, New York, described the feeling that overcame him
when he first met his fiancée, Augusta McKim. ‘I felt...as I had never felt in the presence of a lady before and there
seemed to be a kind of [direction] saying to me that I was now meeting her whom it was appointed should be my
special object of affection and love.’

Yet even in deeply impassioned love letters such as this one, writers stressed that their love was not motivated solely
by transient emotions, but by mutuality of tastes, companionship, trust, and shared interests. Alexander Rice made
this point in typical terms: emotion alone would not have led him ‘blindly forward had not I discovered in you those
elements of character and those qualities of mind which my judgment approved.’ The kind of love that early
nineteenth century Americans sought was not transient passion, declared Henry Poor, a young Bangor, Maine,
attorney, in a letter to his fiancée, but a higher kind of love, ‘the kind that seeks its gratification in mutual sympathy.’

The most surprising fact disclosed in early nineteenth century love letters is that courting couples were less sexually
restrained than the myth of Victorian sexual values would suggest. Although the colonial custom of bundling -
according to which a courting couple shared a common bed without undressing - had fallen into disuse by 1800,
physical displays of affection remained an important part of courtship. Seventeen-year-old Lester Frank Ward, who
would later become one of the foremost late nineteenth century American sociologists, recorded in his diary a visit to
his fiancée’s house: ‘my beloved and I went down, made a fire, and sat down to talk and kiss and embrace and bathe
in love.’ Other surviving love letters also suggest that physical affection and sexual intimacy played an important role
in many courtships. Mary Butterfield of Racine, Wisconsin, described her feelings after spending an evening with her
fiancé in the Racine Hotel: ‘I was so glad afterwards when you seemed so sincerely pleased & happy - so satisfied
with me.’ Still, her feelings were confused. ‘...It was a pleasure and yet women so naturally guard such treasures
with jealousy & care, that it seems very ‘strange’ to yield them even to the ‘best loved one’ who has a claim to such
kindnesses. So of course it seemed very ‘strange’ to me.’

Yet ironically at the same time that courting couples were often so open in their expression of their affection, young
women, in particular, more openly disclosed their fears of marriage. ‘There can be no medium in the wedded state,’
noted one Massachusetts woman. ‘It must either be happy or miserable.’ While men were likely to stress the
pleasures marriage would bring, women, in their correspondence, expressed fears about marriage. It was a ‘sad,
sour, sober beverage’ bringing ‘some joys but many crosses.’ In their courtship letters, women often associated
marriage with the loss of their liberty - often linking marriage with loss of self - and forebodings about the dangers of
childbearing - often omitting children from their fantasies of an ideal marriage.

Marriage was such an awesome step that few women in the late eighteenth or early nineteenth centuries entered into
the relationship lightly. After her husband died in 1767, Mary Fish, a Connecticut widow, remained unmarried for nine
years despite at least three proposals of marriage. She finally remarried in 1776, but only after her future husband
read a document Mary had composed describing the qualities she wanted in a spouse. Entitled ‘Portrait of a Good
Husband,’ the document stated that he should ‘gratify’ her ‘reasonable inclinations,’ enter into her griefs and
participate in her joys, should not be jealous or abuse his wife or stepchildren, and should not mismanage or
dissipate her inheritance.

To move from ‘girlhood’ to housewifery had become a rite of passage so difficult that many young women
experienced a ‘marriage trauma’ before taking or failing to take the step. Many women wrote that they ‘trembled’ as
their wedding day approached, that their ‘spirits were much depressed,’ and their minds were ‘loaded with doubts
and fears.’ One woman, Sarah Williams, noted that she felt ‘rather depressed than elevated’ at her impending
marriage and Catharine Beecher, a prominent educator, worried that after her betrothed got over the ‘novelty’ of
marriage he would be ‘so engrossed in science and study as to forget I existed.’

In colonial New England, marriage was regarded as a social obligation and an economic necessity, and virtually all
adults married. But by the early nineteenth century, the number of unmarried women increased to an unprecedented
11 percent.

Marriage became a far more deliberate act than it had been in the past. Marriage was regarded by young women in a
new way - as a closing off of freedoms enjoyed in girlhood. Between 1780 and 1820, young women between the ages
of 14 and 27 enjoyed unprecedented opportunities to attend school and to earn cash income outside of their parents’
home. Many prospective brides who did eventually marry hesitated to leave the relative independence they had
enjoyed in girlhood.

At the same time that marriage became a more difficult transition point for young women, the rituals surrounding
engagement and marriage radically changed. By the 1840s, a host of elaborate, formal new rituals had arisen, which
helped young women and men maneuver the difficult steps toward marriage.

To signify their intention to marry, men and women began to give each other engagement rings. (Over time, it
became more common for a man to present a ring to his fiancée). Families began to announce their children's
engagement in letters to friends and family or formal newspaper announcements.

At the same time, marriage ceremonies increasingly became larger and more formal affairs, attended not simply by
near kin (which had been the custom during the colonial period) but by a much larger number of family members
and friends. Guests received printed invitations to the ceremony and were, in turn, expected to send wedding gifts.

It was during the 1840s that many of the rituals that still characterize wedding ceremonies today first became
widespread, such as the custom that the bride wear a veil and a white dress and that she be assisted by formally
costumed attendants, that the bridegroom present his bride with a wedding ring, and that the bride and groom and
their guests eat a white wedding cake.

These rituals were intended to mark off marriage as an especially beautiful and solemn occasion, the supreme
occurrence of life. The bride was dressed in white to signify her purity and virtue. At a time when civil marriage was
becoming prevalent on the European continent, it was only in Britain and America, the twin archetypes of the
emerging market economy, that a sacramental conception of marriage triumphed.

Early Industrialization

In the 1820s and 1830s, America became the world's leader in adopting mechanization, standardization, and mass
production. Manufacturers began to adopt labor-saving machinery that allowed workers to produce more goods at
lower costs. So impressed were foreigners with these methods of manufacture that they called them the "American
system of production."
The single most important figure in the development of the American system was Eli Whitney, the inventor of the
cotton gin. In 1798, Whitney persuaded the U.S. government to award him a contract for 10,000 muskets to be
delivered within two years. Until then, rifles had been manufactured by skilled artisans, who made individual parts by
hand, and then carefully fitted the pieces together. At the time Whitney made his offer, the federal arsenal at
Springfield, Massachusetts, was capable of producing only 245 muskets in two years. Whitney's idea was to develop
precision machinery that would allow a worker with little manual skill to manufacture identical gun parts that would
be interchangeable from one gun to another. The first year he produced 500 muskets.
In 1801, in order to get an extension on his contract, Whitney demonstrated his new system of interchangeable parts
to President John Adams and Vice President Thomas Jefferson. He disassembled ten muskets and put ten new
muskets together out of the individual pieces. His system was a success. (In fact, the muskets used in the
demonstration were not assembly line models; they had been carefully hand-fitted beforehand).

Other industries soon adopted the "American system of manufacturing." As early as 1800 manufacturers of wooden
clocks began to use interchangeable parts. Makers of sewing machines used mass production techniques as early as
1846, and the next year, manufacturers mechanized the production of farm machinery.

Innovation was not confined to manufacturing. During the years following the War of 1812, American agriculture
underwent a transformation nearly as profound and far-reaching as the revolution taking place in industry. During
the 18th century, most farm families were largely self-sufficient. They raised their own food, made their own clothes
and shoes, and built their own furniture. Cut off from markets by the high cost of transportation, farmers sold only a
few items, like whiskey, corn, and hogs, in exchange for such necessities as salt and iron goods. Farming methods
were primitive. With the exception of plowing and furrowing, most farm work was performed by hand. European
travelers deplored the backwardness of American farmers, their ignorance of the principles of scientific farming, their
lack of labor-saving machinery, and their wastefulness of natural resources. Few farmers applied manure to their
fields as fertilizer or practiced crop rotation. As a result, soil erosion and soil exhaustion were commonplace.
Commented one observer: "Agriculture in the South does not consist so much in cultivating land as in killing it."

Beginning in the last decade of the 18th century, agriculture underwent profound changes. Some farmers began to
grow larger crop surpluses and to specialize in cash crops. A growing demand for cotton for England's textile mills led
to the introduction of long-staple cotton from the West Indies into the islands and lowlands of Georgia and South
Carolina. Eli Whitney's invention of the cotton gin in 1793--which permitted an individual to clean 50 pounds of short-
staple cotton in a single day, 50 times more than could be cleaned by hand--made it practical to produce short-staple
cotton in the South (which was much more difficult to clean and process than long-staple cotton). Other cash crops
raised by southern farmers included rice, sugar, flax for linen, and hemp for rope fibers. In the Northeast, the growth
of mill towns and urban centers created a growing demand for hogs, cattle, sheep, corn, wheat, wool, butter, milk,
cheese, fruit, vegetables, and hay to feed horses.

As production for the market increased, farmers began to demand improved farm technology. In 1793 Charles
Newbold, a New Jersey farmer, spent his entire fortune of $30,000 developing an efficient cast-iron plow. Farmers
refused to use it, fearing that iron would poison the soil and cause weeds to grow. Twenty years later, a Scipio, New
York, farmer named Jethro Wood patented an improved iron plow made out of interchangeable parts. Unlike wooden
plows, which required two men and four oxen to plow an acre in a day, Wood's cast-iron plow allowed one man and
one yoke of oxen to plow the same area. Demand was so great that manufacturers infringed on Wood's patents and
produced thousands of copies of this new plow yearly.

A shortage of farm labor encouraged many farmers to adopt labor-saving machinery. Prior to the introduction in
1803 of the cradle scythe--a rake used to cut and gather up grain and deposit it in even piles--a farmer could not
harvest more than half an acre a day. The horse rake--a device introduced in 1820 to mow hay--allowed a single
farmer to perform the work of eight to ten men. The invention in 1836 of a mechanical thresher, used to separate
the wheat from the chaff, helped to cut in half the man-hours required to produce an acre of wheat.
By 1830 the roots of America's future industrial growth had been firmly planted. Back in 1807, the nation had just 15
or 20 cotton mills, containing approximately 8,000 spindles. By 1831 the number of spindles in use totaled nearly a
million and a quarter. By 1830 Pittsburgh produced 100 steam engines a year; Cincinnati, 150. Factory production
had made household manufacture of shoes, clothing, textiles, and farm implements obsolete.

Hollywood Today

In a 1992 bestseller, Hollywood vs. America, Michael Medved, co-host of public television's Sneak Previews,
described Hollywood as a "poison factory," befouling America's moral atmosphere and assaulting the country's "most
cherished values." Today's films, he argued, use their enormous capacity to influence opinion by glamorizing
violence, maligning marriage, mocking authority, promoting sexual promiscuity, ridiculing religion, and bombarding
viewers with an endless stream of profanity, gratuitous sex, and loutish forms of behavior. Where once the movies
offered sentiment, elegance, and romance, now, Medved contends, ideologically-motivated producers and directors
promote their own divisive agenda: anti-religion, anti-family, anti-military.

In fact, the picture is more complicated than Medved suggests. As film critic David Denby has observed,
abandonment of the Production Code in 1966 did indeed increase the amount of sex, violence, and profanity on the
screen; but particularly in the 1980s and '90s, Hollywood has also increased the amount of family entertainment it
offers, including feature-length cartoons like Aladdin and Beauty and the Beast; family comedies, like Honey, I Shrunk
the Kids; and positive portrayals of the teaching profession, like Dead Poets Society and Stand and Deliver. At the
same time that some films merely exploited history as a backdrop for action and adventure, like the Indiana Jones or
the Back to the Future trilogies, there has also been a revival of serious historical films like Glory and Malcolm X.
Meanwhile, independent directors released a growing number of idiosyncratic and inexpensive films, like The Crying
Game, while within Hollywood itself female movie makers, like Penny Marshall and Susan Seidelman, and African-
American film makers, like Spike Lee, have received unprecedented opportunity to bring fresh viewpoints to the
screen.

Nevertheless, as the movie industry enters its second century, many Americans worry about Hollywood's future.
Medved is not alone in complaining that "they don't make movies like they used to." A basic problem facing today's
Hollywood is the rapidly rising cost of making and marketing a movie: an average of $40 million today. The immense
cost of producing movies has led the studios to seek guaranteed hits: blockbusters loaded with high-tech special
effects, sequels, and remakes of earlier movies, foreign films, and even old TV shows.

Hollywood has also sought to cope with rising costs by focusing ever more intently on its core audiences. Since the
mid-1980s, the movie going audience has continued to decrease in size. Ticket sales fell from 1.2 billion in 1983 to
950 million in 1992, with the biggest drop occurring among adults. With the decline in the size of the adult audience,
the single largest group of movie-goers now consists of teenage boys, who are particularly attracted to thrills,
violence, and crude laughs. And since over half of Hollywood's profits are earned overseas, the industry has
concentrated much of its energy on crude action films easily understood by an international audience, featuring stars
like Arnold Schwarzenegger and Sylvester Stallone.

For a century, the movie industry has been the nation's most important purveyor of culture and entertainment to the
masses, playing a critical role in the shift from Victorian to distinctively modern, consumer values; from a world of
words to a visual culture; from a society rooted in islands of localities and ethnic groups to a commercialized mass
culture. The movies taught Americans how to kiss, make love, conceive of gender roles, and understand their place
in the world. Whether film will continue to serve as the nation's preeminent instrument of cultural expression--
reflecting and also shaping values and cultural ideals--remains to be seen.

Integrating the Armed Forces

Today, many Americans consider the U.S. Army the country's most successful effort at racial integration. Colin
Powell, now the country's first African American Secretary of State, has become a symbol of the Army's relative
openness. He rose through the Army's ranks to become the first black head of the Joint Chiefs of Staff.

Yet the integration of the armed forces is a relatively recent development. As recently as the end of 1950, when the
Korean War was entering its seventh month, African American troops were trained at a segregated facility at Fort
Dix, New Jersey, near New York City. Even later, in the fall of 1954, an all-African American unit, the 94th Engineer
Battalion, was stationed in Europe.

African Americans have participated actively in the country's wars. An African American minuteman, Prince
Easterbrooks, a slave, was wounded at the battle of Lexington, and, altogether, some 5,000 African Americans
fought for American independence during the Revolution despite British promises of freedom to any slaves who
defected to the Loyalist side.

It was not until the Civil War that African Americans were required to fight in racially separate units. In 1869,
Congress made racial separation in the military official government policy. This policy remained intact through the
Spanish American War, World War I (when two African American divisions participated in combat), and World War II.

It was during World War II that the policy of racial segregation within the military began to break down under
pressure from African American leaders, who pointed out the contradiction of a country fighting Nazi racism having a
segregated military. In March 1943, the War Department ordered the desegregation of recreational facilities at
military facilities. In mid-1944, the War Department ordered all buses to be operated in a non-discriminatory fashion.

Military necessity helped to shatter racial barriers. In December, 1944, 250,000 German troops launched a massive
counteroffensive, later known as the Battle of the Bulge, in Belgium. With only 80,000 Allied troops available in the
area to resist the German forces, black troops were invited to volunteer to fight alongside white troops. Some 2,500
African American troops volunteered. Although black and white troops served in separate platoons, this experience
helped the Army break with its usual practice of placing African American troops in separate units and assigning
them to non-combat duties.

In February 1948, President Harry S. Truman directed the U.S. armed forces to desegregate as quickly as possible.
In July, he issued Executive Order 9981 calling on the military to end racial discrimination. It would take several
years - and another war - before the military actually ended segregation. Three factors would ultimately lead to
integration: the growing recognition that segregation undercut the United States' moral stature during the Cold War;
the need to reduce racial tensions within the military; and the manpower needs produced by the Korean war.

Following President Truman's Executive Order, two boards were established to make recommendations about
integration. A presidential commission chaired by Charles Fahy recommended an end to discrimination in jobs,
schooling, assignment, and recruitment. An Army board headed by Lieutenant General S.J. Chamberlin called on the
Army to remain segregated and retain racial quotas. In the end, the Army agreed to open all jobs and military
training schools on a non-segregated basis. There were isolated examples of unit-level integration, including at Camp
Jackson, South Carolina in early 1951.

It was the Korean War that finally led to the desegregation of previously all-white combat units. After six months of
fighting, insufficient white replacement troops were available and black enlistments were high. In February 1951, the
Chamberlin board was asked to reexamine its conclusions. Although it acknowledged that integrated units had fewer
racial tensions than a combination of segregated units, it continued to call for a 10 percent Army quota of African
Americans. At this time, 98 percent of the Army's black soldiers served in segregated units. In May, General Matthew
Ridgway requested permission to desegregate his command.

In March, 1951, the Army asked Johns Hopkins University's Operations Research Office to analyze the impact of
integrating its forces. Extensive surveys of troops and analysis of combat performance in Korea revealed that:

Integration raised the morale of African American soldiers and did not reduce that of white soldiers;
Integration was favored by black soldiers and was not opposed by most white soldiers;
Experience in integrated units increased white support for integration;
Integration improved fighting effectiveness.

An essential finding was that integration reduced racial tensions within the military. In December 1951, the Chief of
Staff ordered all Army commands to desegregate.

Overview of American Becomes a World Power

During the 1890s, the United States showed little interest in foreign affairs. Its army, with just 28,000 soldiers,
was one-twentieth the size of France's or Germany's. Its 10,000-man navy was a sixth the size of Britain's and
half the size of Spain's.

Toward the end of the 19th century, interest in foreign affairs mounted. Some worried that the United States
was being left behind in the scramble for territory, markets, raw materials, and outlets for investment. Others,
such as the naval strategist Alfred Thayer Mahan, believed that national prosperity depended on control of sea
lanes. Still others believed that the United States had a special mission to uplift backward peoples.

Beginning in the late 1880s, a new assertiveness characterized American foreign policy, evident in disputes
with Germany, Chile, and Britain. In 1893, Americans in Hawaii forced Queen Liliuokalani to abdicate; the
United States annexed Hawaii five years later. War with Spain in 1898 led to the acquisition of Puerto Rico,
Guam, and the Philippines, where the United States confronted a two-year insurrection.

Fear that the United States was being shut out of trade with China led Secretary of State John Hay to issue the
1899 Open Door Note. The Roosevelt Corollary to the Monroe Doctrine declared that the United States would
exercise “international police power” in the Western Hemisphere. The United States assisted Panama in
securing its independence from Colombia in order to build a canal across the Isthmus of Panama. The U.S.
occupied Nicaragua for 20 years, Haiti for 19 years, and the Dominican Republic for 8 years.

Summary:

At the turn of the 20th century, the United States became a world power. In 1898 and 1899, the United States
annexed Hawaii and acquired the Philippines, Puerto Rico, parts of the Samoan islands, and other Pacific
islands. Expansion raised the fateful question of whether the newly annexed peoples would receive the rights
of American citizens.

The Spanish American War and the acquisition of the Philippines represented both an extension of earlier
expansionist impulses and a sharp departure from assumptions that had guided American foreign policy in the
past. For the first time, the United States made a major strategic commitment in the Far East, acquired
territory never intended for statehood, and committed itself to police actions and intervention in the Caribbean
and Central America.

Overview of the Great Depression

The Great Depression was steeper and more protracted in the United States than in other industrialized
countries. The unemployment rate rose higher and remained higher longer than in any other western country.
As it deepened, the Depression had far-reaching political consequences. The Depression vastly expanded the
scope and scale of the federal government and created the modern welfare state. It gave rise to a philosophy
that the federal government should provide a safety net for the elderly, the jobless, the disabled, and the poor,
and that the federal government was responsible for ensuring the health of the nation's economy and the
welfare of its citizens.

The stock market crash of October 1929 brought the economic prosperity of the 1920s to a symbolic end. For
the next ten years, the United States was mired in a deep economic depression. By 1933, unemployment had
soared to 25 percent, up from 3.2 percent in 1929. Industrial production declined by 50 percent, international
trade plunged 30 percent, and investment fell 98 percent.

Causes of the Depression

Causes of the Great Depression included: insufficient purchasing power among the middle class and the
working class to sustain high levels of production; falling crop and commodity prices prior to the Depression;
the stock market's dependence on borrowed money; and wrongheaded government policies, including high
tariffs that reduced international trade and contracted the money supply.

Political and Social Consequences

The Great Depression transformed the American political and economic landscape. It produced a major political
realignment, creating a coalition of big-city ethnics, African Americans, organized labor, and Southern
Democrats committed, to varying degrees, to interventionist government. It strengthened the federal presence
in American life, spawning such innovations as national old-age pensions, unemployment compensation, aid to
dependent children, public housing, federally-subsidized school lunches, insured bank deposits, the
minimum wage, and stock market regulation. It fundamentally altered labor relations, producing a revived
labor movement and a national labor policy protective of collective bargaining. It transformed the farm
economy by introducing federal price supports. Above all, it led Americans to view the federal government as
an agency of action and reform and the ultimate protector of public well-being.

The Great Depression and American Culture

The Great Depression challenged certain basic precepts of American culture, especially the faith in individual
self-help, business, the inevitability of progress, and limited government. The Depression encouraged a search
for the real America. There was a new interest in “the people,” in regional cultures, and in folk traditions. The
movies played a crucial role in sustaining American ideals in a time of social upheaval across Europe. Films
projected images of a world in which financial success was possible and of a society in which class barriers
could be overcome.

This section examines why the seemingly boundless prosperity of the 1920s ended so suddenly and why the
Depression lasted as long as it did. It assesses the Depression's human toll and the policies adopted to combat
the crisis. It devotes particular attention to the Depression's impact on African Americans, the elderly, Mexican
Americans, labor, and women. In addition to assessing the ideas that informed the New Deal policies, this
section examines the New Deal’s critics and evaluates the New Deal's impact.

Overview of the Vietnam War

Vietnam was the longest war in American history and the most unpopular American war of the 20th century. It
resulted in nearly 60,000 American deaths and in an estimated 2 million Vietnamese deaths. Even today, many
Americans still ask whether the American effort in Vietnam was a sin, a blunder, a necessary war, a noble cause, or
an idealistic, if failed, effort to protect the South Vietnamese from totalitarian
government.

Summary:

Between 1945 and 1954, the Vietnamese waged an anti-colonial war against France, which received $2.6 billion
in financial support from the United States. The French defeat at Dien Bien Phu was followed by a peace
conference in Geneva. As a result of the conference, Laos, Cambodia, and Vietnam received their
independence, and Vietnam was temporarily divided between an anti-Communist South and a Communist
North. In 1956, South Vietnam, with American backing, refused to hold unification elections. By 1958,
Communist-led guerrillas, known as the Viet Cong, had begun to battle the South Vietnamese government.

To support the South's government, the United States sent in 2,000 military advisors--a number that grew to
16,300 in 1963. The military condition deteriorated, and by 1963, South Vietnam had lost the fertile Mekong
Delta to the Viet Cong. In 1965, President Lyndon Johnson escalated the war, commencing air strikes on North
Vietnam and committing ground forces--which numbered 536,000 in 1968. The 1968 Tet Offensive by the
North Vietnamese turned many Americans against the war.

The next president, Richard Nixon, advocated Vietnamization, withdrawing American troops and giving South
Vietnam greater responsibility for fighting the war. In 1970, Nixon attempted to slow the flow of North
Vietnamese soldiers and supplies into South Vietnam by sending American forces to destroy Communist supply
bases in Cambodia. This act violated Cambodian neutrality and provoked antiwar protests on the nation's
college campuses.

From 1968 to 1973, efforts were made to end the conflict through diplomacy. In January 1973, an agreement
was reached; U.S. forces were withdrawn from Vietnam, and U.S. prisoners of war were released. In April
1975, South Vietnam surrendered to the North, and Vietnam was reunited.

Consequences

1. The Vietnam War cost the United States 58,000 lives and 350,000 casualties. It also resulted in between
one and two million Vietnamese deaths.

2. Congress enacted the War Powers Act in 1973, requiring the president to receive explicit Congressional
approval before committing American forces overseas.

Overview of World War I

The Associated Press ranked World War I as the 8th most important event of the 20th century. In fact, almost
everything that subsequently happened occurred because of World War I: the Great Depression, World War II,
the Holocaust, the Cold War, and the collapse of empires. No event better underscores the utter
unpredictability of the future. Europe hadn't fought a major war for 100 years. A product of miscalculation,
misunderstanding, and miscommunication, the conflict might have been averted at many points during the five
weeks preceding the fighting.

World War I destroyed four empires - German, Austro-Hungarian, Ottoman, and Romanov - and touched off
colonial revolts in the Middle East and Vietnam. WWI shattered Americans' faith in reform and moral crusades.
WWI carried far-reaching consequences for the home front, including prohibition, women's suffrage, and a
bitter debate over civil liberties.

World War I killed more people (9 million combatants and 5 million civilians) and cost more money ($186
billion in direct costs and another $151 billion in indirect costs) than any previous war in history.

Triggered by the assassination of Archduke Franz Ferdinand, the heir to the throne of the Austro-Hungarian
Empire, World War I began in August 1914 when Germany invaded Belgium and France. Several events led to
U.S. intervention: the sinking of the Lusitania, a British passenger liner; unrestricted German submarine
warfare; and the Zimmerman note, which revealed a German plot to provoke Mexico to war against the United
States. Millions of American men were drafted, and Congress created a War Industries Board to coordinate
production and a National War Labor Board to unify labor policy. The Treaty of Versailles deprived Germany of
territory and forced it to pay reparations. President Wilson agreed to the treaty because it provided for the
establishment of a League of Nations, but he was unable to persuade the Senate to ratify the treaty.

Consequences:

1. Nearly 10 million soldiers died and about 21 million were wounded. U.S. deaths totaled 116,516.

2. Four empires collapsed: the Russian Empire in 1917, the German and the Austro-Hungarian in 1918, and
the Ottoman in 1922.

3. Independent republics were formed in Austria, Czechoslovakia, Estonia, Hungary, Latvia, Lithuania, and
Turkey.

4. Most Arab lands that had been part of the Ottoman Empire came under the control of Britain and France.

5. The Bolsheviks took power in Russia in 1917, and fascists triumphed in Italy in 1922.

6. Other consequences of the war included the mass murder of Armenians in Turkey and an influenza epidemic
that killed over 25 million people worldwide.

7. Under the peace settlement, Germany was required to pay reparations eventually set at $33 billion; accept
responsibility for the war; cede territory to Belgium, Czechoslovakia, Denmark, France, and Poland; give up its
overseas colonies; and accept an allied military force on the west bank of the Rhine River for 15 years.

Overview of World War II

World War II killed more people, involved more nations, and cost more money than any other war in history.
Altogether, 70 million people served in the armed forces during the war, and 17 million combatants died.
Civilian deaths were even greater. At least 19 million Soviet civilians, 10 million Chinese, and 6 million European
Jews lost their lives during the war.

World War II was truly a global war. Some 70 nations took part in the conflict, and fighting took place on the
continents of Africa, Asia, and Europe, as well as on the high seas. Entire societies participated as soldiers or as
war workers, while others were persecuted as victims of occupation and mass murder.

World War II cost the United States a million casualties and nearly 400,000 deaths. In both domestic and
foreign affairs, its consequences were far-reaching. It ended the Depression, brought millions of married
women into the workforce, initiated sweeping changes in the lives of the nation's minority groups, and
dramatically expanded government's presence in American life.

The War at Home & Abroad

On September 1, 1939, World War II started when Germany invaded Poland. By November 1942, the Axis
powers controlled territory from Norway to North Africa and from France to the Soviet Union. After defeating
the Axis in North Africa in May 1943, the United States and its Allies invaded Sicily in July 1943 and forced Italy
to surrender in September. On D-Day, June 6, 1944, the Allies landed in Northern France. In December, a
German counteroffensive (the Battle of the Bulge) failed. Germany surrendered in May 1945.

The United States entered the war following a surprise attack by Japan on the U.S. Pacific fleet in Hawaii. The
United States and its Allies halted Japanese expansion at the Battle of Midway in June 1942 and in other
campaigns in the South Pacific. From 1943 to August 1945, the Allies hopped from island to island across the
Central Pacific and also battled the Japanese in China, Burma, and India. Japan agreed to surrender on August
14, 1945 after the United States dropped the first atomic bombs on the Japanese cities of Hiroshima and
Nagasaki.

Consequences:

1. The war ended Depression unemployment and dramatically expanded government's presence in American
life. It led the federal government to create a War Production Board to oversee conversion to a wartime
economy and the Office of Price Administration to set prices on many items and to supervise a rationing
system.

2. During the war, African Americans, women, and Mexican Americans found new opportunities in industry.
But Japanese Americans living on the Pacific coast were relocated from their homes and placed in internment
camps.

The Dawn of the Atomic Age

In 1939, Albert Einstein wrote a letter to President Roosevelt, warning him that the Nazis might be able to
build an atomic bomb. On December 2, 1942, Enrico Fermi, an Italian refugee, produced the first self-
sustained, controlled nuclear chain reaction in Chicago.

To ensure that the United States developed a bomb before Nazi Germany did, the federal government started
the secret $2 billion Manhattan Project. On July 16, 1945, in the New Mexico desert near Alamogordo, the
Manhattan Project's scientists exploded the first atomic bomb.
It was during the Potsdam negotiations that President Harry Truman learned that American scientists had
tested the first atomic bomb. On August 6, 1945, the Enola Gay, a B-29 Superfortress, released an atomic
bomb over Hiroshima, Japan. Between 80,000 and 140,000 people were killed or fatally wounded. Three days
later, a second bomb fell on Nagasaki. About 35,000 people were killed. The following day Japan sued for
peace.

President Truman's defenders argued that the bombs ended the war quickly, avoiding the necessity of a costly
invasion and the probable loss of tens of thousands of American lives and hundreds of thousands of Japanese
lives. His critics argued that the war might have ended even without the atomic bombings. They maintained
that the Japanese economy would have been strangled by a continued naval blockade, and that Japan could
have been forced to surrender by conventional firebombing or by a demonstration of the atomic bomb's power.
Page 78

POLITICAL PARTIES & ELECTIONS

The Federalist Party was the first American political party and existed from the early 1790s to
1816. The party was run by Alexander Hamilton, who was Secretary of the Treasury and chief
architect of George Washington's administration. The Federalists called for a strong national
government that promoted economic growth. The Democratic-Republican Party was an
American political party formed by Thomas Jefferson and James Madison in 1791–1793 to
oppose the centralising policies of the new Federalist Party.

Although these parties were soon succeeded by others, there remains to this day the basic
political cleavage between those who want to see an activist central government and those
who want to limit the power of the central government - now represented broadly by the
Democratic Party and the Republican Party respectively.

To an extent quite extraordinary in democratic countries, the American political system is
dominated by these two political parties: the Democratic Party and the Republican Party (often
known as the 'Grand Old Party' or GOP). These are very old and very stable parties - the
Democrats go back to 1824 and the Republicans were founded in 1854.

In illustrations and promotional material, the Democratic Party is often represented as a
donkey, while the Republican Party is featured as an elephant. The origin of these symbols is
the political cartoonist Thomas Nast who came up with them in 1870 and 1874 respectively.

The main reason for the dominance of these two parties is that - like most other Anglo-Saxon
countries (notably Britain) - the electoral system is 'first past the post' or simple majority
which, combined with the large size of the constituencies in the House and (even more)
the Senate, ensures that effectively only two parties can play. The other key factor is the huge
influence of money in the American electoral system. Since effectively a candidate can spend
any amount he can raise (not allowed in many other countries) and since one can buy
broadcasting time (again not allowed in many countries), the US can only 'afford' two parties
or, to put it another way, candidates of any other party face a formidable financial barrier to
entry.

Some people tend to view the division between the Democratic Party and the Republican Party
in the United States as the same as that between Labour and Conservative in Britain or
between Social Democrats and Christian Democrats in Germany. The comparison is valid in the
sense that, in each country, one political party is characterised as Centre-Left and the other as
Centre-Right or, to put it another way, one party is more economically interventionist and
socially radical than the other. However, the analogy has many weaknesses.

1. The Centre in American politics is considerably to the Right of the Centre in most
European states including Britain, Germany, France, Italy and (even more especially) the
Scandinavian countries. So, for instance, most members of the Conservative Party in the
UK would support a national health service, whereas many members of the Democratic
Party in the US would not.

2. As a consequence of the enormous geographical size of the United States and the
different histories of the different states (exemplified by the Civil War), geography is a
factor in ideological positioning to a much greater extent than in other democratic
countries. For instance, a Northern Republican could be more liberal than a Southern
Democrat. Conversely, there is a group of Democratic Congressmen who are fiscally very
conservative - they are known as "blue dog" Democrats or even DINO (Democrats In Name Only).
Page 79

3. In the United States, divisions over social matters - such as abortion, capital punishment,
same-sex relationships and stem cell research - matter and follow party lines in a way
which is not true of most European countries. In Britain, for instance, these sorts of issues
would be regarded as matters of personal conscience and would not feature prominently
in election debates between candidates and parties.

4. In the USA, religion is a factor in politics in a way unique in western democracies.
Candidates openly proclaim their faith in a manner which would be regarded as bizarre
elsewhere (even in a Catholic country like France) and religious groupings - such as the
Christian Coalition of America - exert a significant political influence in a
manner which would be regarded as improper in most European countries (Poland is an
exception here).

5. In the United States, the 'whipping system' - that is the instructions to members of the
House and the Senate on how to vote - is not as strict or effective as it is in most
European countries. As a consequence, members of Congress are less constrained by
party affiliation and freer to act individually.

6. In the USA, political parties are much weaker institutions than they are in other
democracies. Between the selection of candidates, they are less active than their
counterparts in other countries and, during elections, they are less influential in
campaigning, with individual politicians and their campaigns having much more influence.

7. The cost of elections is much greater in the US than in other democracies which has the
effects of limiting the range of candidates, increasing the influence of corporate interests
and pressure groups, and enhancing the position of the incumbent office holder
(especially in the winning of primaries). As long ago as 1895, the Chairman of the
Republican National Committee Mark Hanna stated: "There are two things that are
important in politics. The first is money, and I can't remember what the second one is."

8. Whereas in other countries, voters shape the policies and select the candidates of a party
by joining it, in the USA voters register as a supporter of one of the major parties and
then vote in primary elections to determine who should be the party's candidate in the
'real' election.

One other oddity of the American party system is that, whereas in most countries of the world
the colour red is associated with the Left-wing party and the colour blue with the Right-wing
party, in the United States the reverse is the case. So the 'blue states' are those traditionally
won by the Democrats, while the 'red states' are those normally controlled by the Republicans.

Two interesting features of American political elections are low turnout and the importance of
incumbency.

Traditionally turnout in US congressional elections is much lower than in other liberal
democracies, especially those of Western Europe. When there is a presidential election, turnout
is only about half; when there is no presidential election, turnout is merely about one third.
The exception was the elections of 2008: the excitement of the candidacy of Barack Obama led
to an unusually high turnout of 63%, the highest since 1960 (the election of John F Kennedy).

While Congress as an institution is held in popular contempt, voters like their member of
Congress and indeed there is a phenomenon known as 'sophomore surge' whereby incumbents
tend to increase their share of the vote when they seek re-election. More generally most
incumbents win re-election for several reasons: they allocate time and resources to waging a
permanent re-election campaign; they can win "earmarks" which are appropriations of
government spending for projects in the constituency; and they find it easier than challengers
to raise money for election campaigns.
Page 81

President Truman: Using Atomic Bombs against Japan, 1945

Every American president makes decisions with enormous repercussions for the future. Some of these decisions
prove successful; others turn out to be blunders. In virtually every case, presidents must act with contradictory
advice and limited information. At 8:15 a.m., August 6, 1945, an American B-29 released an atomic bomb over
Hiroshima, Japan. Within minutes, Japan’s eighth largest city was destroyed. By the end of the year, 140,000 people
had died from the bomb’s effects. After the bombing was completed, the United States announced that Japan faced "a
rain of ruin from the air, the like of which has never been seen on this earth."

Background: In 1939, Albert Einstein,
writing on behalf of physicist Leo Szilard and other leading physicists, informed President Franklin D. Roosevelt that Nazi
Germany was carrying on experiments in the use of atomic weapons. In October 1939, the federal government
began a modest research program which later became the two-billion-dollar Manhattan Project. Its purpose was
to produce an atomic bomb before the Germans. On December 2, 1942, scientists in Chicago succeeded in starting a
nuclear chain reaction, demonstrating the possibility of unleashing atomic power.
It was not until April 25, 1945, 13 days after the death of Franklin Roosevelt, that the new president, Harry S.
Truman, was briefed about the Manhattan Project. Secretary of War Henry Stimson informed him that "within four
months we shall in all probability have completed the most terrible weapon ever known in human history."

Stimson proposed that a special committee be set up to consider whether the atomic bomb would be used, and if so,
when and where it would be deployed. Members of this panel, known as the Interim Committee, which Stimson
chaired, included George L. Harrison, President of the New York Life Insurance Company and special consultant in
the Secretary's office; James F. Byrnes, President Truman's personal representative; Ralph A. Bard, Under Secretary
of the Navy; William L. Clayton, Assistant Secretary of State; and scientific advisers Vannevar Bush, Karl T.
Compton, and James B. Conant. General George Marshall and Manhattan Project Director Leslie Groves also
participated in some of the committee’s meetings. On June 1, 1945, the Interim Committee recommended that
atomic bombs should be dropped on military targets in Japan as soon as possible and without warning. One
committee member, Ralph Bard, convinced that Japan might be seeking a way to end the war, called for a two- to
three-day warning before the bomb was dropped.

A group of scientists involved in the Manhattan project opposed the use of the atomic bomb as a military weapon. In
a report signed by physicist James Franck, they called for a public demonstration of the weapon in a desert or on a
barren island. On June 16, 1945, a scientific panel consisting of physicists Arthur H. Compton, Enrico Fermi, E. O.
Lawrence, and J. Robert Oppenheimer reported that it did not believe that a technical demonstration would be
sufficient to end the war.

Meanwhile, after a June 18, 1945, meeting with his military advisors, President Truman approved a plan that called
for an initial invasion of Japan on November 1. By the summer of 1945, a growing number of government policy
makers were concerned about Soviet intervention in the war against Japan. Already, the Red Army occupied
Rumania, Bulgaria, Yugoslavia, Czechoslovakia, and Hungary. At the Potsdam conference in July, Soviet Premier
Joseph Stalin informed President Truman and British Prime Minister Winston Churchill that the Red Army would enter
the war against Japan around August 8. On July 25, 1945, the military was authorized to drop the bomb "as soon as
weather will permit visual bombing after about 3 August 1945 on one of the targets: Hiroshima, Kokura, Niigata and
Nagasaki." The bombing of Hiroshima, followed by the Soviet declaration of war against Japan on August 9th and the
bombing of Nagasaki the same day, led the Japanese leadership to accept a surrender so long as their country could
retain the emperor.

Variety of points of view:
1. Ralph Bard, Under Secretary of the Navy: Ever since I have been in touch with this program I have had a
feeling that before the bomb is actually used against Japan that Japan should have some preliminary warning
for say two or three days in advance of use. The position of the United States as a great humanitarian nation
and the fair play attitude of our people generally is responsible in the main for this feeling.
2. James Byrnes: [Physicist Leo Szilard wrote:] "[Byrnes] was concerned about Russia's postwar behavior.
Russian troops had moved into Hungary and Rumania, and Byrnes thought it would be very difficult to
persuade Russia to withdraw her troops from these countries, that Russia might be more manageable if
impressed by American military might, and that a demonstration of the bomb might impress Russia."
3. General Dwight D. Eisenhower: "In 1945 ... , Secretary of War Stimson visited my headquarters in Germany,
[and] informed me that our government was preparing to drop an atomic bomb on Japan. I was one of those
who felt that there were a number of cogent reasons to question the wisdom of such an act.... During his
recitation of the relevant facts, I had been conscious of a feeling of depression and so I voiced to him my grave
misgivings, first on the basis of my belief that Japan was already defeated and that dropping the bomb was
completely unnecessary, and second because I thought that our country should avoid shocking world opinion
by the use of a weapon whose employment was, I thought, no longer mandatory as a measure to save
American lives. It was my belief that Japan was, at that very moment, seeking some way to surrender with a
minimum loss of 'face.'"
Page 82

September 11, 2001

Osama bin Laden was born in 1957 to a Yemeni bricklayer. He was one of the youngest of nearly fifty children. Bin
Laden grew up in Saudi Arabia, where his father founded a construction firm that would become the largest in the
desert kingdom. He inherited millions of dollars after his father’s death and graduated from one of the kingdom’s
leading universities with a degree in civil engineering.

In 1979, bin Laden left Saudi Arabia to raise money and recruits to help Muslims in Afghanistan expel the Soviet
army, which was propping up a communist government in the country. During the mid-1980s, bin Laden built roads,
tunnels, and bunkers in Afghanistan.

Although the U.S. had helped him and his fellow warriors expel the Soviets from Afghanistan, bin Laden would turn
against the United States. He was furious about the deployment of American troops in Saudi Arabia--the birthplace of
the Prophet Muhammad and home of the two holiest Muslim shrines--that had been sent to protect the oil-rich
kingdom from an Iraqi invasion. He was also angry about U.S. support for Israel and the American role in enforcing
an economic embargo against Iraq. His goal was to remove American forces from his Saudi homeland, destroy the
Jewish state in Israel, and defeat pro-Western dictatorships around the Middle East.

By 1998, bin Laden had formed a terrorist network called Al-Qaeda, which in Arabic means “the base.” He also
provided training camps, financing, planning, recruitment, and other support services for fighters seeking to strike at
the United States.

American officials believe bin Laden's associates operate in over 40 countries--in Europe and North America, as well
as in the Middle East and Asia. U.S. government officials believe bin Laden was involved in at least four major
terrorist attacks against the United States’ interests prior to the September 11, 2001 attack: the 1993 World Trade
Center bombing; the 1996 killing of 19 U.S. soldiers in Saudi Arabia; the 1998 bombings of U.S. embassies in Kenya
and Tanzania; and the 2000 attack on the USS Cole at a port in Yemen, in which 17 U.S. sailors were killed.

Al-Qaeda viewed the U.S. responses to these attacks as half-hearted. In 1998, in retaliation for the bombings of the
U.S. embassies in Africa, American cruise missiles struck a network of terrorist compounds in Afghanistan and a
pharmaceutical plant in Sudan. The pharmaceutical plant target was mistakenly believed to have been producing
chemicals for use in nerve gas.

The September 11th Attacks

On September 11th, hijackers turned commercial airliners into missiles and attacked key symbols of American
economic and military might. These hideous attacks leveled the World Trade Center towers in New York, destroyed
part of the Pentagon, and left Americans in a mood similar to that which the country experienced after the
devastating Japanese attack on the American fleet at Pearl Harbor in 1941.

The succession of horrors began at 8:45 a.m., when American Airlines Flight 11, carrying 92 people from Boston to
Los Angeles, crashed into the World Trade Center's north tower. Eighteen minutes later, United Airlines Flight 175,
carrying 65 people, also bound for Los Angeles from Boston, struck the World Trade Center's south tower. At 9:40
a.m., American Airlines Flight 77, flying from Washington, D.C., to Los Angeles and carrying 64 people aboard,
crashed into the Pentagon. At 10 a.m., United Airlines Flight 93, flying from Newark, N.J., to San Francisco, crashed
80 miles southeast of Pittsburgh. Passengers onboard the airliner, having heard about the attacks on New York and
Washington, D.C., apparently stormed the airplane’s cockpit and prevented the hijackers from attacking the nation’s
capital.

Millions of television viewers watched in utter horror. At 9:50 a.m., the World Trade Center's south tower collapsed.
At 10:29 a.m., the World Trade Center's north tower also collapsed.

More than 3,000 innocent civilians and rescue workers perished as a result of these acts of terror. This was about the
same number of Americans who died on June 6, 1944, during the D-Day invasion of Nazi-occupied France. This was
nearly as many as the 3,620 Americans--the largest number of Americans to die in combat on a single day--who died
at the Civil War battle of Antietam on September 17, 1862. More Americans died in two hours on September 11th
than died in the War of 1812, the Spanish American War, or the Gulf War.
Page 83
The U.S. Response

The U.S. response to the September 11th attacks was immediate and forceful. Over a period of just three days,
Congress voted to spend $40 billion for recovery. Then, like his father in the period before the Persian Gulf War,
George W. Bush organized an international coalition against Al-Qaeda and the Taliban government in Afghanistan
that supported it. He persuaded Pakistan, which had been the main sponsor of Afghanistan’s Taliban government, to
support the United States diplomatically and logistically.
On October 7, 2001, in retaliation for the September 11th attacks, a U.S.-led coalition launched an attack against
targets in Afghanistan--the beginning of what President Bush has promised would be a long campaign against
terrorist groups and the states that support them. The American strategy in Afghanistan involved using American air
power and ground targeting to support the Northern Alliance, the major indigenous force opposing the Taliban. Later,
U.S. and British forces coordinated ground operations against Al-Qaeda and the Taliban.
Afghanistan's rugged terrain, weather extremes, and veteran guerrilla-style fighters presented a serious
challenge to the American military. But the effective use of laser-guided missiles, cluster bombs, 2,000-pound Daisy
Cutter bombs, unmanned drones, and U.S. and British Special Forces, in conjunction with indigenous Afghani forces,
succeeded in overthrowing the Taliban government. However, some members of Al-Qaeda and the Taliban
apparently escaped into isolated regions along the Afghanistan-Pakistan border. Between 1,000 and 1,300 Afghani
civilians were killed.
Civil Liberties and National Security: Trying to Strike a Balance

The war on terror has forced the nation to toughen its national security. Following the horrifying events of September
11, 2001, more than 1,000 people, mainly Arab and Muslim men suspected of having information about terrorism,
were detained by the federal government. These detainees were held without charges, and their names and
whereabouts were largely kept secret.

In the wake of the September 11th attacks, Congress enacted legislation that gave law enforcement agencies broader
authority to wiretap suspects and to monitor online communication. Congress also expanded the government’s
authority to detain or deport aliens who associate with members of terrorist organizations. It also authorized greater
intelligence sharing among the FBI, the CIA, the Immigration and Naturalization Service, and local law enforcement
agencies.

President Bush responded to the attacks by proposing a cabinet-level Department of Homeland Security. Homeland
Security would help to prevent terrorist attacks within the United States, reduce the country's vulnerability to
terrorism, and minimize the damage and assist in the recovery from attacks that do occur. The new department would be
responsible for promoting border security, responding to chemical, biological, and radiological attacks, and utilizing
information analysis.
Arab Americans and Muslim Americans

In the immediate aftermath of the September 11th attacks, some Americans directed their anger at Arab Americans,
Muslims, and South Asians. In a suburb in Phoenix, Arizona, an Indian immigrant who practiced the Sikh faith was
murdered in a hate crime. So, too, was a Pakistani grocer in Dallas, Texas. In Irving, Texas, bullets were fired into an
Islamic community center. Some 300 protestors tried to storm a Chicago-area mosque. Near Detroit, Michigan, an
Islamic school had to close down because of daily bomb threats.

"Those who directed their anger against Arab Americans and Muslims should be ashamed," President Bush declared.
"Muslim Americans make an incredibly valuable contribution to our country," he said. "They need to be treated with
respect." Today, there are approximately 3 million Arab Americans in the United States. About a third live in
California, Michigan, and New York.

Arab Americans belong to many different religions. While most are Muslims, many are Catholics, Orthodox Christians,
Jews, or Druze. Prominent political figures of Arab descent include Ralph Nader, former Senate Majority Leader
George Mitchell, and former Secretary of Health and Human Services Donna Shalala.
According to a poll conducted by the Pew Memorial Trusts, approximately two-fifths of the nation’s approximately 7
million Muslim Americans were born in the United States, with the rest coming from 80 other countries. About 32
percent are South Asian, 26 percent are Arab, 20 percent African American, 7 percent African, and 14 percent report
some other background. About a fifth are converts to Islam.
The Meaning of September 11th
The September 11th attacks dramatically altered the way the United States looked at itself and the world. The
attacks produced a surge of patriotism and national unity and pride. However, the terrorist strikes also fostered a
new sense of vulnerability.
Page 84

The American Renaissance

The 1840s and 1850s witnessed an extraordinary outpouring of literary creativity as American writers abandoned
their subservience to foreign models and created a distinctly American literature.

During his lifetime, Edgar Allan Poe (1809–1849) received far more notoriety from his legendary dissipation than
from his poetry or short stories. Poe was raised by a Richmond, Virginia, merchant after his father abandoned the
family and his mother died. For two years he went to the University of Virginia and briefly attended West Point, but
drinking, gambling debts, and bitter fights with his guardian cut short his formal education. At the age of 27, he
married his 13-year-old first cousin, who died a decade later of tuberculosis, brought on by cold and starvation.
Found drunk and unconscious in Baltimore in 1849, Poe died at the age of 40.

Sorely underappreciated by contemporaries, Poe invented the detective story; edited the Southern Literary
Messenger, one of the country’s leading literary journals; wrote incisive essays on literary criticism; and produced
some of the most masterful poems and frightening tales of horror ever written. His literary techniques inspired a
number of important French writers, including Charles Baudelaire, Stéphane Mallarmé, and Paul Valéry. Poe said that
his writing style consisted of “the ludicrous heightened into the grotesque; the fearful colored into the horrible; the
witty exaggerated into the burlesque; the singular wrought into the strange and mystical.”

Nathaniel Hawthorne (1804–1864), the author of The Scarlet Letter (1850), one of America’s towering works of
fiction, did not consider himself a novelist. He wrote “romances,” he insisted--imaginative representations of moral
problems, rather than novelistic depictions of social realities. A descendant of one of the Salem witch-trial judges,
the Salem-born Hawthorne grew up in a somber and solitary atmosphere. His father, a sea captain, perished on a
voyage when his son was just 4 years old, and Hawthorne’s mother spent the remainder of her life in mourning.
After attending Bowdoin College, where Henry Wadsworth Longfellow and future president Franklin Pierce were
among his classmates, he began to write. It would not be until 1837, however, when he published Twice-Told Tales,
that the 33-year-old Hawthorne first gained public recognition. He lived briefly at Brook Farm and participated in the
transcendentalist circle, but did not share their idealistic faith in humanity’s innate goodness.

Hawthorne was a secretive, painfully shy man. But no pre–Civil War author wrote more perceptively about guilt--
sexual, moral, and psychological. “In the depths of every human heart,” he wrote in an early tale, “there is a tomb
and a dungeon, though the lights, the music, the revelry above may cause us to forget their existence, and the
buried ones, or prisoners whom they hide.” In his fiction, Hawthorne, more than any other early 19th century
American writer, challenged the larger society’s faith in science, technology, progress, and humanity’s essential
goodness. Many of his greatest works project 19th century concerns--about women’s roles, sexuality, and religion--
onto 17th century Puritan settings. Some of his stories examine the hubris of scientists and social reformers who
dare tamper with the natural environment and human nature.

Herman Melville (1819–1891), author of Moby Dick (1851), possibly America’s greatest romance, had little formal
education and claimed that his intellectual development did not begin until he was 25. By then, he had already seen
his father go bankrupt and die insane, worked as a cabin boy on a merchant ship, served as a common seaman on a
whaling ship, deserted in the Marquesas Islands, escaped on an Australian whaler, and been imprisoned in Tahiti. He
drew on these experiences in his first two books, Typee (1846) and Omoo (1847), which were popular successes, but
his third book Mardi (1849), a complex blend of political and religious allegory, metaphysics, and cosmic romance,
failed miserably, foreshadowing the reception of his later works. Part of a New York literary circle called Young
America, Melville dreamed of creating a novel as vast and energetic as the nation itself. In Moby Dick, he produced
such a masterwork. Based on the tale of “Mocha-Dick,” a gigantic white whale that sank a whaling ship, Moby Dick
combined whaling lore and sea adventure into an epic drama of human arrogance, producing an allegory that
explores what happens to a people who defy divine limits. Tragically, neither Moby Dick nor Melville’s later works
found an audience, and Melville spent his last years as a deputy customs collector in New York. He died in utter
obscurity, and his literary genius was rediscovered only in the 1920s.

In 1842, Ralph Waldo Emerson lectured in New York and called for a truly original American poet who could fashion
verse out of “the factory, the railroad, and the wharf.” Sitting in Emerson’s audience was a 22-year-old New York
printer and journalist named Walt Whitman (1819–1892). A carpenter’s son with only five years of schooling,
Whitman soon became Emerson’s ideal of the native American poet, with the publication of Leaves of Grass in 1855.
“A mixture of Yankee transcendentalism and New York rowdyism,” Leaves of Grass was, wrote Emerson, “the most
extraordinary piece of wit & wisdom that America has yet contributed.” Most reviewers, however, reacted scornfully
to the book, deeming it “trashy, profane & obscene” for its sexual frankness.
Page 85

The Bill of Rights

Seven states had bills of rights protecting fundamental freedoms from government infringement. Among the rights
that were guaranteed were freedom of the press, of speech, and of religion, and the right to a jury trial.

The Constitutional Convention did include specific protections in the Constitution. Article VI barred religious tests
for federal office, and Article I provided certain protections in criminal law. It guaranteed that the writ
of habeas corpus (a protection against illegal imprisonment) "shall not be suspended" except in times of rebellion or
invasion. It also prohibited bills of attainder (laws that punish individuals without trial) and ex post facto laws
(laws that punish behavior that took place before their enactment). It also forbade any state to pass laws "impairing
the obligation of contracts."

George Mason, the main author of Virginia's 1776 Declaration of Rights, wished that the Constitution "had been
prefaced with a bill of rights." But James Madison felt a bill of rights was unnecessary and superfluous. He feared
that specifying certain rights for protection might suggest that other rights could be tampered with. He also worried
that such protections would be insufficient "on those occasions when control is most needed."

But pressure for a Bill of Rights was intense. Thomas Jefferson wrote Madison: "...a bill of rights is what the people
are entitled to against every government on earth, general or particular, and what no government should refuse or
rest on inference." During the ratification debates, the Constitution's supporters agreed to adopt a Bill of Rights.

State ratification conventions proposed more than two hundred amendments. From these, Madison distilled
19 possible amendments. Congress approved 12 of the amendments and the states ratified 10. One of the rejected
amendments dealt with the size of the House of Representatives. The other amendment prevents Congress from
increasing its salary. Salary changes cannot take effect until after the next congressional election. This amendment
was ratified in 1992.

During the 19th century, the impact of the Bill of Rights was limited. In the 1833 case of Barron v. Baltimore, the
Supreme Court ruled that the Bill of Rights only protects individuals from the national, and not the state,
governments.

The First Ten Amendments

I. Freedom of religion, speech, and the press, and the right of assembly and to petition government

The First Amendment prohibits Congress from creating an established church. It has been interpreted to
forbid government support for religious doctrines. The amendment also prohibits Congress from passing
laws to restrict worship, speech, the press, or to prevent people from assembling peacefully. In addition,
Congress may not prevent people from petitioning government.

II. Right to bear arms

The Second Amendment has been interpreted by some to give citizens the right to possess firearms.
Others believe it grants the states the right to maintain their own militias.

III. Billeting of soldiers

The Third Amendment forbids the government from housing soldiers in homes in peacetime without their
owners' consent.

IV. Searches and seizures

The Fourth Amendment requires legal authorities to obtain a search warrant before conducting a search
of a person's possessions.

V. Rights in criminal cases

The Fifth Amendment says that no one can be tried for a federal crime unless he or she is indicted by a
grand jury, a group of citizens who decide whether there is sufficient evidence to put the person on
trial. The amendment also states that a person cannot be tried twice for the same offense (unless the
jury fails to reach a verdict). The amendment guarantees that individuals cannot be required to testify
Page 86

against themselves and cannot be deprived of "life, liberty, or property, without due process of law." The
amendment also forbids government from taking a person's property for public use without fair payment.
VI. Rights to a fair trial
The Sixth Amendment guarantees a person accused of a crime the right to a "speedy and public trial, by
an impartial jury" in the jurisdiction where the alleged crime was committed. The Amendment also
guarantees that accused persons will be informed of the charges against them and that they have the
right to cross-examine witnesses and to have a lawyer to defend them.
VII. Rights in civil cases

The Seventh Amendment guarantees the right to a jury trial in civil lawsuits.

VIII. Bails, fine, and punishments

The Eighth Amendment prohibits excessive bail or fines and "cruel and unusual" punishments.

IX. Rights retained by the people

The Ninth Amendment ensures that rights unmentioned in the Bill of Rights are protected.

X. Powers retained by the states and the people

The Tenth Amendment ensures that the powers not delegated to the federal government are retained by
the states and the people.
Page 87

The Constitution and Slavery

On the 200th anniversary of the U.S. Constitution, Thurgood Marshall, the first African American to sit on the
Supreme Court, said that the Constitution was "defective from the start." He pointed out that the framers had left
out a majority of Americans when they wrote the phrase, "We the People." While some members of the
Constitutional Convention voiced "eloquent objections" to slavery, Marshall said they "consented to a document
which laid a foundation for the tragic events which were to follow."

The word "slave" does not appear in the Constitution. The framers consciously avoided the word, recognizing that it
would sully the document. Nevertheless, slavery received important protections in the Constitution. The notorious
Three-fifths clause--which counted three-fifths of the slave population in apportioning representation--gave the
South extra representation in the House and extra votes in the Electoral College. Thomas Jefferson would have lost
the election of 1800 if not for the Three-fifths compromise. The Constitution also prohibited Congress from outlawing
the Atlantic slave trade for twenty years. A fugitive slave clause required the return of runaway slaves to their
owners. The Constitution gave the federal government the power to put down domestic rebellions, including slave
insurrections.

The framers of the Constitution believed that concessions on slavery were the price for the support of southern
delegates for a strong central government. They were convinced that if the Constitution restricted the slave trade,
South Carolina and Georgia would refuse to join the Union. But by sidestepping the slavery issue, the framers left the
seeds for future conflict. After the convention approved the great compromise, Madison wrote: "It seems now to be
pretty well understood that the real difference of interests lies not between the large and small but between the
northern and southern states. The institution of slavery and its consequences form the line of discrimination."

Of the 55 Convention delegates, about 25 owned slaves. Many of the framers harbored moral qualms about slavery.
Some, including Benjamin Franklin (a former slave-owner) and Alexander Hamilton (who was born in a slave colony
in the British West Indies) became members of antislavery societies.

On August 21, 1787, a bitter debate broke out over a South Carolina proposal to prohibit the federal government
from regulating the Atlantic slave trade. Luther Martin of Maryland, a slaveholder, said that the slave trade should be
subject to federal regulation since the entire nation would be responsible for suppressing slave revolts. He also
considered the slave trade contrary to America's republican ideals. "It is inconsistent with the principles of the
Revolution," he said, "and dishonorable to the American character to have such a feature in the Constitution."

John Rutledge of South Carolina responded forcefully. "Religion and humanity have nothing to do with this question,"
he insisted. Unless regulation of the slave trade was left to the states, the southern-most states "shall not be parties
to the union." A Virginia delegate, George Mason, who owned hundreds of slaves, spoke out against slavery in
ringing terms. "Slavery," he said, "discourages arts and manufactures. The poor despise labor when performed by
slaves." Slavery also corrupted slaveholders and threatened the country with divine punishment: "Every master of
slaves is born a petty tyrant. They bring the judgment of Heaven on a country."

Oliver Ellsworth of Connecticut accused slaveholders from Maryland and Virginia of hypocrisy. They could afford to
oppose the slave trade, he claimed, because "slaves multiply so fast in Virginia and Maryland that it is cheaper to
raise than import them, whilst in the sickly rice swamps [of South Carolina and Georgia] foreign supplies are
necessary." Ellsworth suggested that ending the slave trade would benefit slave-owners in the Chesapeake region,
since the demand for slaves in other parts of the South would increase the price of slaves once the external supply
was cut off.

The controversy over the Atlantic slave trade was ultimately settled by compromise. In exchange for a 20-year ban
on any restrictions on the Atlantic slave trade, southern delegates agreed to remove a clause restricting the national
government's power to enact laws requiring goods to be shipped on American vessels (benefiting northeastern
shipbuilders and sailors). The same day this agreement was reached, the convention also adopted the fugitive slave
clause, requiring the return of runaway slaves to their owners.

Was the Constitution a proslavery document, as abolitionist William Lloyd Garrison claimed when he burned the
document in 1854 and called it "a covenant with death and an agreement with Hell"? This question still provokes
controversy. If the Constitution temporarily strengthened slavery, it also created a central government powerful
enough to eventually abolish the institution.
Page 88

The Equal Rights Amendment

In March 1972, Congress passed an Equal Rights Amendment (ERA) to the United States Constitution, prohibiting
sex discrimination, with only 8 dissenting votes in the Senate and 24 in the House. Before the year was over,
22 state legislatures ratified the ERA. Ratification by 38 states was required before the amendment would be added
to the Constitution. Over the next five years, only 13 more states ratified the amendment--and 5 states rescinded
their ratification. In 1978, Congress gave proponents of the amendment 39 more months to complete ratification,
but no other state gave its approval.

The ERA had been defeated, but why? Initially, opposition came largely from organized labor, which feared that the
amendment would eliminate state "protective legislation" that established minimum wages and maximum hours for
women workers. Increasingly, however, resistance to the amendment came from women of lower economic and
educational status, whose self-esteem and self-image were bound up with being wives and mothers and who wanted
to ensure that women who devoted their lives to their families were not accorded lower status than women who
worked outside the home.

The leader of the anti-ERA movement was Phyllis Schlafly, a Radcliffe-educated mother of six from Alton, Illinois. A
larger than life figure, Schlafly earned a law degree at the age of 54, wrote nine books (including the 1964 best-seller
A Choice Not an Echo), and created her own lobbying group, the Eagle Forum. Schlafly argued that the ERA was
unnecessary because women were already protected by the Equal Pay Act of 1963 and the Civil Rights Act of 1964,
which barred sex discrimination, and that the amendment would outlaw separate public restrooms for men and
women and would deny wives the right to financial support. She also raised the "women in combat" issue by
suggesting that the passage of the ERA would mean that women would have to fight alongside men during war.
Page 89

The Great Depression in Global Perspective

The Great Depression was a global phenomenon, unlike previous economic downturns which generally were confined
to a handful of nations or specific regions. Africa, Asia, Australia, Europe, and North and South America all suffered
from the economic collapse. International trade fell 30 percent as nations tried to protect their industries by raising
tariffs on imported goods. "Beggar-thy-neighbor" trade policies were a major reason why the Depression persisted
as long as it did. By 1932, an estimated 30 million people were unemployed around the world.

Also, in contrast to the relatively brief economic "panics" of the past, the Great Depression dragged on with no end in
sight. As the depression deepened, it had far-reaching political consequences. One response to the depression was
military dictatorship--a response that could be found in Argentina and in many countries in Central America. Western
industrialized countries cut back sharply on the purchase of raw materials and other commodities. The price of coffee,
cotton, rubber, tin, and other commodities dropped 40 percent. The collapse in raw material and agricultural
commodity prices led to social unrest, resulting in the rise of military dictatorships that promised to maintain order.

A second response to the Depression was fascism and militarism--a response found in Germany, Italy, and Japan. In
Germany, Adolf Hitler and his Nazi Party promised to restore the country's economy and to rebuild its military. After
becoming chancellor in 1933, Hitler outlawed labor unions, restructured German industry into a series of cartels, and
after 1935, instituted a massive program of military rearmament that ended high unemployment. In Italy, fascism
arose even before the Depression's onset under the leadership of Italian dictator Benito Mussolini. In Japan,
militarists seized control of the government during the 1930s. In an effort to relieve the Depression, Japanese
military officers conquered Manchuria, a region rich in raw materials, and coastal China in 1937.

A third response to the Depression was totalitarian communism. In the Soviet Union, the Great Depression helped
solidify Joseph Stalin's grip on power. In 1928, Stalin instituted a planned economy. His First Five Year Plan called for
rapid industrialization and "collectivization" of small peasant farms under government control. To crush opposition to
his program, which required peasant farmers to give their products to the government at low prices, Stalin exiled
millions of peasants to labor camps in Siberia and instituted a program of terror called the Great Purge. Historians
estimate that as many as 20 million Soviets died during the 1930s as a result of famine and deliberate killings.

A final response to the Depression was welfare capitalism, which could be found in countries including Canada, Great
Britain, and France. Under welfare capitalism, government assumed ultimate responsibility for promoting a
reasonably fair distribution of wealth and power and for providing security against the risks of bankruptcy,
unemployment, and destitution.

Compared to other industrialized countries, the economic decline brought on by the Depression was steeper and
more protracted in the United States. The unemployment rate rose higher and remained higher longer than in any
other western society. European countries significantly reduced unemployment by 1936. However, the American
jobless rate still exceeded 17 percent as late as 1939, when World War II began in Europe. It did not drop below 14
percent until 1941.

The Great Depression transformed the American political and economic landscape. It produced a major political
realignment, creating a coalition of big city ethnics, African Americans and Southern Democrats committed, to
varying degrees, to interventionist government. The Depression strengthened the federal presence in American life,
producing such innovations as national old age pensions, unemployment compensation, aid to dependent children,
public housing, federally subsidized school lunches, insured bank deposits, the minimum wage, and stock market
regulation. It fundamentally altered labor relations, producing a revived labor movement and a national labor policy
protective of collective bargaining. It transformed the farm economy by introducing federal price supports and rural
electrification. Above all, the Great Depression produced a fundamental transformation in public attitudes. It led
Americans to view the federal government as the ultimate protector of public well-being.
Page 90

The Holocaust

At 3 p.m., January 27, 1945, Russian troops of the 100th and 107th divisions entered Auschwitz, a village in
southern Poland 30 miles west of Krakow. There, inside Auschwitz's concentration camps, they found 7,600 inmates-
-including Otto Frank, the father of Anne Frank. The discovery of the concentration camps revealed World War II's
most terrible secret: the Holocaust. Two days later, the U.S. 7th Army liberated Dachau, another notorious Nazi
death camp located outside of Munich. The liberators could scarcely believe what they saw: starving prisoners with
bones protruding from their skin and serial numbers tattooed on their arms; stacks of half-burned corpses; and piles
of human hair.

Auschwitz was not the first Nazi concentration camp--that dubious distinction belonged to Dachau, which was set up
in 1933--but it was the most infamous. At Auschwitz, 1.6 million people died. Of the victims, 1.3 million were Jews
and 300,000 were Gypsies, Polish Catholics, and Russian prisoners of war. Altogether, people from 28 nations lost
their lives there, including the disabled, homosexuals, political prisoners, and others deemed unfit to survive by Adolf
Hitler's Third Reich.

Auschwitz had two main areas. "Auschwitz I" contained a gas chamber and a crematorium and provided housing for
prisoners used in slave labor and in Dr. Josef Mengele's medical experimentation station (where one experiment
involved seeing how long babies survived without food).

"Auschwitz II-Birkenau" contained four gas chambers and crematoria. It was here that cattle cars dumped their
exhausted passengers. Prisoners entered through a gate inscribed with the infamous words "Work Will Make You
Free." SS guards directed each new arrival to the left or to the right. The healthy and strong went to the right. The
weak, the elderly, and the very young went up a ramp to the left--to the gas chambers, disguised as showers.
Inmates were told that the showers were used to disinfect them, but they contained no plumbing, and the shower
heads were fake. Guards injected Zyklon B through openings in the ceilings and walls, then cremated the bodies. The
ashes were used as road filler and fertilizer or simply dumped into surrounding ponds and fields.

Auschwitz was a product of Adolf Hitler's demented belief that Germans constituted a master race which had a right
to kill those they deemed inferior. "Nature is cruel, therefore, we too may be cruel," Hitler stated in 1934. "If I can
send the flower of the German nation into the hell of war...then surely I have a right to remove millions of an inferior
race that breeds like vermin!"

In 1941 and 1942, the Nazi fuehrer initiated the "Final Solution to the Jewish Problem." The Nazis did their best to
disguise their murderous scheme behind euphemisms and camouflage, but sometimes the truth slipped out. Heinrich
Himmler, the official in charge of carrying out the final solution, explained to his top officers: "In public we will never
speak of it. I am referring to the annihilation of the Jewish people. In our history, this is an unwritten and never-to-
be written page of glory."

In the spring of 1944, four prisoners escaped from Auschwitz, carrying tangible proof of the Nazis' systematic
program of mass murder. In mid-July, American and British leaders learned what was happening at Auschwitz, but
they rejected pleas to bomb the gas chambers or the roads and rail lines leading to the camps. Military officials
opposed the bombing because it would divert "considerable air support essential to the success of our forces now
engaged in decisive operations."

This was not the first time that Western help failed to come. During the 1930s, the U.S. State Department blocked
efforts by Jewish refugees to migrate to the United States. Between 1933 and 1945, the United States allowed only
132,000 Jewish refugees to enter the country, just 10 percent of the quota allowed by law. This opposition to Jewish
immigration, in turn, reflected widespread anti-Semitism. As late as 1939, opinion polls indicated that 53 percent of
Americans agreed with the statement, "Jews are different and should be restricted." In the end, less than 500,000
Jews (out of 6.5 million) survived in Nazi-occupied Europe.

The Holocaust was a singular and unique event in human history. Never before had a sovereign state, with the
cooperation of bureaucrats, industrialists, and civilians, sought systematically to exterminate an entire people. Yet
many wonder whether Auschwitz's terrible lesson has been learned. Despite the establishment of Israel, improved
Christian-Jewish relations, and heightened sensitivity to racism, many remain ignorant of the past. A 1995 opinion
poll found that 5 percent of Americans deny that the Holocaust occurred and 10 percent express doubts or ignorance.
More than half a century after the liberation of Auschwitz, "ethnic cleansing" and the persecution of religious, racial,
and ethnic groups continues in Bosnia, China, Guatemala, India, Sri Lanka, Turkey, and elsewhere.
Page 91

The "New" Hollywood

As the 1960s began, few would have guessed that the decade would be one of the most socially conscious and
stylistically innovative in Hollywood's history. Among the most popular films at the decade's start were Doris Day
romantic comedies like That Touch of Mink (1962) and epic blockbusters like The Longest Day (1962), Lawrence of
Arabia (1962), and Cleopatra (1963). Yet, as the decade progressed, Hollywood radically shifted focus and began to
produce an increasing number of anti-establishment films, laced with social commentary, directed at the growing
youth market.

By the early 1960s, an estimated 80 percent of the film-going population was between the ages of 16 and 25. At first,
the major studios largely ignored this audience, leaving it in the hands of smaller studios like American International
Pictures, which produced a string of cheaply made horror movies, beach blanket movies--like Bikini Beach (1964) and
How to Stuff a Wild Bikini (1965)--and motorcycle gang pictures--like The Wild Angels (1966). Two films released in
1967--Bonnie and Clyde and The Graduate--awoke Hollywood to the size and influence of the youth audience. Bonnie
and Clyde, the story of two depression era bank robbers, was advertised with the slogan: "They're young, they're in
love, they kill people." Inspired by such French New Wave pictures as Breathless (1960), the film aroused intense
controversy for romanticizing gangsters and transforming them into social rebels. A celebration of youthful rebellion
also appeared in The Graduate, which was the third-highest grossing film up until this time. In this film, a young
college graduate rejects a hypocritical society and the traditional values of his parents--and the promise of a career in
"plastics"--and finds salvation in love.

A number of the most influential films of the late '60s and early '70s sought to revise older film genres--like the war film,
the crime film, and the western--and rewrite Hollywood's earlier versions of American history from a more critical
perspective. Three major war films--Little Big Man, Patton, and M*A*S*H-- reexamined the nineteenth-century
Indian wars, World War II, and the Korean War in light of America's experience in Vietnam. The Wild Bunch (1969)
and McCabe and Mrs. Miller (1971) offered radical reappraisals of the mythology of the American frontier. Francis
Ford Coppola's The Godfather (1972) revised and enhanced the gangster genre by transforming it into a critical
commentary on an immigrant family's pursuit of the American dream.

During the mid- and late-70s, the mood of American films shifted sharply. Unlike the highly politicized films of the
early part of the decade, the most popular films of the late 1970s and early 1980s were escapist blockbusters like
Star Wars (1977), Superman (1978), and Raiders of the Lost Ark (1981)-- featuring spectacular special effects,
action, and simplistic conflicts between good and evil--inspirational tales of the indomitable human spirit, like Rocky
(1976)--or nostalgia for a more innocent past--like Animal House (1978) and Grease (1978). Glamorous outlaws like
Bonnie and Clyde were replaced by law-and-order avengers like Dirty Harry and Robocop. Sports--long regarded as a
sure box office loser--became a major Hollywood obsession, with movies like Hoosiers, Chariots of Fire, Karate Kid,
and The Mighty Ducks celebrating competitiveness and victory. Movies which offered a tragic or subversive
perspective on American society, like The Godfather or Chinatown, were replaced by more upbeat, undemanding
films, and especially by comedies, featuring such actors as Dan Ackroyd, Chevy Chase, Eddie Murphy, and Bill
Murray.

Critics partly blamed the trend toward what Mark Crispin Miller has called "deliberate anti-realism" upon economic
changes within the film industry. In 1966, Gulf and Western Industries executed a takeover of Paramount and the
conglomerization of the film industry began. In 1967, United Artists merged with Transamerica Corporation; in 1969
Kinney Services acquired Warner Brothers. In one sense the takeovers were logical. Conglomerates wanted to
acquire interests in businesses that serviced Americans' leisure needs. The heads of the conglomerates, however,
had no idea how to make successful motion pictures. Too often they believed that successful movies could be mass
produced, that statisticians could discover a scientific method for making box office hits.

A trend toward the creation of interlocking media companies, encompassing movies, magazines, newspapers,
and books accelerated in 1985 when the Department of Justice overturned the 1948 anti-trust decree which had
ended vertical integration within the film industry. As a result, many of the major studios were acquired by large
media and entertainment corporations, like Sony, which purchased Columbia Pictures, Time Warner (which owns
Time magazine, Simon & Schuster publishers, and Warner Brothers), and Rupert Murdoch, whose holdings include
HarperCollins publishers, the Fox television network, and Twentieth Century Fox. At the same time that these large
entertainment conglomerates arose, many smaller independent producers like Lorimar and De Laurentiis,
disappeared.

Nevertheless, important issues continued to be addressed through film. Many films focused on problems of romance,
family, gender, and sexuality--aspects of life radically changed by the social transformations of the 1960s and early
1970s. Certainly, some films tried to evade the profound changes that had taken place in gender relations--like An
Officer and a Gentleman, an old-fashioned screen romance--or Flashdance--an updated version of the Cinderella
story--or 10 and Splash--which depict male fantasies about relationships with beautiful, utterly compliant women. But
many other popular films addressed such serious questions as the conflict between the family responsibilities and
personal needs (for example, Kramer v. Kramer) or women's need to develop their independence (like An Unmarried
Woman, Desperately Seeking Susan, and Thelma and Louise).

At a time when politicians and news journalists were neglecting racial and urban issues, movies like Boyz n the
Hood, Grand Canyon, Do the Right Thing, and Jungle Fever focused on such problems as the racial gulf separating
blacks and whites, the conditions in the nation's inner cities, the increasing number of poor single parent families,
police brutality, and urban violence.
Ironically, the most controversial issue of the 1960s and early 1970s, the Vietnam War, only began to be seriously
examined on the screen in the late '70s. Although many films of the late 60s and early 70s embodied the bitter
aftertaste of the war, the conflict itself remained strikingly absent from the screen, as Hollywood, like the country as
a whole, had difficulty adjusting to the grim legacy of a lost and troubling war. During the conflict, Hollywood
produced only a single film dealing with Vietnam--John Wayne's The Green Berets. Modeled along the lines of such
World War II combat epics as The Sands of Iwo Jima and earlier John Wayne westerns like The Alamo, the film
portrayed decent Americans struggling to defend an embattled outpost along the Laotian border nicknamed Dodge
City.

Although America's active military participation in the Vietnam War ended in 1973, the controversy engendered by
the war raged on long after the firing of the last shot. Much of the controversy centered on the returning veterans.
Veterans were shocked by the cold, hostile reception they received when they returned to the United States. In First
Blood (1982), John Rambo captured the pain of the returning veterans: "It wasn't my war-- you asked me, I didn't
ask you...and I did what I had to do to win....Then I came back to the world and I see all those maggots at the
airport, protesting me, spitting on me, calling me a baby-killer...."

During the 1970s and '80s, the returning Vietnam War veteran loomed large in American popular culture. He was
first portrayed as a dangerous killer, a deranged ticking time bomb that could explode at any time and in any place.
He was Travis Bickle in Taxi Driver (1976), a veteran wound so tight that he seemed perpetually on the verge of
snapping. Or he was Colonel Kurtz in Apocalypse Now (1979), who adjusted to a mad war by going mad himself.

Not until the end of the '70s did popular culture begin to treat the Vietnam War veteran as a victim of the war rather
than a madman produced by the war. Coming Home (1978) and The Deer Hunter (1978) began the popular
rehabilitation of the veteran, and such films as Missing in Action (1984) and Rambo: First Blood II (1985)
transformed the veteran into a misunderstood hero. Where some films, like the Rambo series, focused on the exploits
of one-man armies or vigilantes armed to the teeth, who had been kept from winning the war because of government
cowardice and betrayal, another group of Vietnam War films--like Platoon, Casualties of War, and Born on the Fourth
of July--took quite a different view of the war. Focusing on innocent, naive "grunts"--the ground troops who actually
fought the war--these movies retold the story of the Vietnam War in terms of the soldiers' loss of idealism, the
breakdown of unit cohesion, and the struggle to survive and sustain a sense of humanity and integrity in the midst of
war.
Page 93

The Revolution of 1800

In 1800, the nation again had a choice between John Adams and Thomas Jefferson. Federalists feared that Jefferson
would return power to the states, dismantle the army and navy, and overturn Hamilton's financial system. The
Republicans charged that the Federalists, by creating a large standing army, imposing heavy taxes, and using
federal troops and the federal courts to suppress dissent, had shown contempt for the liberties of the American people. They
worried that the Federalists' ultimate goal was to centralize power in the national government and involve the United
States in the European war on the side of Britain.

Jefferson's Federalist opponents called him an "atheist in religion, and a fanatic in politics." They claimed he was a
drunkard and an enemy of religion. The Federalist Connecticut Courant warned that "there is scarcely a possibility
that we shall escape a Civil War. Murder, robbery, rape, adultery, and incest will be openly taught and practiced."

Jefferson's supporters responded by charging that President Adams was a monarchist who longed to reunite Britain
with its former colonies. Republicans even claimed that the president had sent General Thomas Pinckney to England
to procure four mistresses, two for himself and two for Adams. Adams's response: "I do declare if this be true,
General Pinckney has kept them all for himself and cheated me out of my two."

The election was extremely close. It was the Constitution's Three-fifths clause, which counted three-fifths of the slave
population in apportioning representation, that gave the Republicans a majority in the Electoral College. Jefferson
appeared to have won by a margin of eight electoral votes. But a complication soon arose. Because each Republican
elector had cast one ballot for Jefferson and one for Burr, the two men received exactly the same number of electoral
votes.

Under the Constitution, the election was now thrown into the Federalist-controlled House of Representatives. Instead
of emphatically declaring that he would not accept the presidency, Burr declined to say anything. So, the Federalists
faced a choice. They could help elect the hated Jefferson--"a brandy-soaked defamer of churches"--or they could
throw their support to the opportunistic Burr. Hamilton disliked Jefferson, but he believed he was a far more
honorable man than Burr, whose "public principles have no other spring or aim than his own aggrandizement."

As the stalemate persisted, Virginia and Pennsylvania mobilized their state militias. Recognizing, as Jefferson put it,
"the certainty that a legislative usurpation would be resisted by arms," the Federalists backed down. After six days of
balloting and 36 ballots, the House of Representatives elected Thomas Jefferson the third president of the United
States. And as a result of the election, Congress adopted the Twelfth Amendment to the Constitution, which gives
each elector in the Electoral College one vote for president and one for vice president.
Page 94

THE SUPREME COURT

What is the Supreme Court?

The Supreme Court is the highest court in the land. Originally it had six members, but over
time this number has increased. Since 1869, it has consisted of nine Justices: the Chief Justice
of the United States and eight Associate Justices. They have equal weight when voting on a
case and the Chief Justice has no casting vote or power to instruct colleagues. Decisions are
made by a simple majority.

Below the Supreme Court, there is a system of Courts of Appeal, and, below these courts,
there are District Courts. Together, these three levels of courts represent the federal judicial
system.

Who is eligible to become a member of the Court?

The Constitution does not specify qualifications for Justices such as age, education, profession,
or native-born citizenship. A Justice does not have to be a lawyer or a law school graduate, but
all Justices have been trained in the law. Many of the 18th and 19th century Justices studied
law under a mentor because there were few law schools in the country.

The last Justice to be appointed who did not attend any law school was James F. Byrnes (1941-
1942). He did not graduate from high school and taught himself law, passing the bar at the
age of 23.

All Supreme Court judges are appointed for life.

How is a member of the Court chosen?

The Justices are nominated by the President and confirmed with the 'advice and consent' of the
Senate. As federal judges, the Justices serve during "good behavior," meaning essentially that
they serve for life and can leave office only through resignation or through impeachment and subsequent
conviction.

Since the Supreme Court makes so many 'political' decisions and its members are appointed so
rarely, the appointment of Justices by the President is often a very charged and controversial
matter. Since Justices serve for life and therefore usually beyond the term of office of the
appointing President, such appointments are often regarded as an important part of any
particular President's legacy.

What are the powers of the Court?

The Supreme Court is the highest court in the United States. The court deals with matters
pertaining to the federal government, disputes between states, and interpretation of the
Constitution.

It can declare legislation or executive action taken at any level of government
unconstitutional, nullifying the law and creating precedent for future laws and decisions.

However, the Supreme Court can only rule on a lower court decision so it cannot take the
initiative to consider a matter.

There are three ways that a matter can come to the Supreme Court:
Page 95

1. A federal authority makes a decision that is challenged as unconstitutional; the challenge goes
   straight to the Supreme Court, which is not obliged to take it.
2. A state makes a decision which someone believes is unconstitutional, but the matter must
   previously have been heard by a Federal Court of Appeal (there are 11 circuits
   covering the 50 states).
3. There is a conflict between states that needs to be resolved (if the two or more states are
   in the same circuit, the matter would first have to go to the appropriate Federal Court of
   Appeal).
Other interesting facts about the Court

Each year, around 8,000 petitions are made to the Supreme Court seeking a judgement,
but each term the number of cases determined is only about 100.

When a case is considered in public by the Court, each side of the case has only half an
hour to state its position. All the detail is set out in documents, and all the rest of the time
of the public hearing is taken up by questions from the Justices.

Decisions of the Supreme Court are taken in private conference, following discussion and
debate. No Justice speaks for a second time until every Justice has spoken once.

Given how difficult it is to change the US Constitution through the formal amendment process,
informal changes to the Constitution have come through various decisions of the Supreme Court,
which have given specific meanings to some of the general phrases in the Constitution. It
is one of the many ironies of the American political system that an unelected and
unaccountable body like the Supreme Court can in practice exercise so much political
power in a system which proclaims itself as so democratic.

The Supreme Court in practice therefore has a much more 'political' role than the highest
courts of European democracies. In the 1960s, the court played a major role in bringing
about desegregation. The scope of abortion in the USA is effectively set by the Supreme
Court whereas, in other countries, it would be set by legislation. Indeed in 2000, it made
the most political decision imaginable by determining - by seven votes to two - the
outcome of that year's presidential election. It decided that George W. Bush had beaten Al
Gore, although Gore won the most votes overall.

A recent and momentous instance of this exercise of political power was the Supreme
Court decision in the case of the challenge to Barack Obama's signature piece of
legislation, the Patient Protection and Affordable Care Act, often dubbed Obamacare. No
less than 26 states challenged the legality of these health reforms under a clause in the
constitution governing interstate commerce. In the end, the Court ruled by five to four
that, while the individual mandate provision in the Act is not itself a tax, the penalties
imposed for not buying health insurance do represent taxes and therefore the entire
requirement falls within the remit of Congress's right to impose taxes.

William Howard Taft (1857-1930) was the 27th President of the United States (1909-
1913) and later the tenth Chief Justice of the United States (1921-1930). He is the only
person to have served in both of these offices.

In the history of the United States, there have been only four women members, two black
members and one Hispanic member of the Supreme Court.
Page 96

The present membership of the Supreme Court includes three women members and one
black member. Of the nine members, five are Catholic and three are Jewish while one -
Neil Gorsuch - was raised as a Catholic but attends a Protestant church.

Following the appointment by President Trump of Neil Gorsuch to the Supreme Court,
there is now a five to four conservative-liberal majority on the court. All the conservative
members were appointed by Republican presidents, while all the liberals were appointed
by Democratic presidents.

A special feature of the American political system in respect of the judiciary is that,
although federal judges are appointed, nationwide 87% of all state court judges are
elected and 39 states elect at least some of their judges. Outside of the United States,
there are only two nations that have judicial elections and then only in limited fashion.
Smaller Swiss cantons elect judges and appointed justices on the Japanese Supreme
Court must sometimes face retention elections (although those elections are a formality).
Page 97

The U.S. Constitution and the Organization of the National Government

The U.S. Constitution created a system of checks and balances and three independent branches of government.

The Legislative Branch

Article I of the Constitution established Congress. The framers of the Constitution expected Congress to
be the dominant branch of government. They placed it first in the Constitution and assigned more
powers to it than to the presidency. Congress was given "all legislative powers," including the power to
raise taxes, coin money, regulate interstate and foreign commerce, promote the sciences and the arts,
and declare war.

The Executive Branch

Article II of the Constitution created the presidency. The president's powers were stated more briefly
than those of Congress. The president was granted "Executive Power," including the power "with the
Advice and Consent of the Senate," to make treaties and appoint ambassadors. The president was also
to serve as Commander in Chief of the army and navy.

In delegate James Wilson's view, the presidency was "the most difficult [issue] of all on which we have
had to decide." Americans had waged a revolution against a king and did not want concentrated power
to appear in another guise. The delegates had to decide whether the chief executive should be one
person or a committee; whether the president should be appointed by Congress; and how long the chief
executive should serve.

On August 18, 1787, a Pennsylvania newspaper carried a leaked report from the Constitutional
Convention. It was the first word on the proceedings that directly quoted a delegate. "We are well
informed" of "reports idly circulating, that it is intended to establish a monarchical government.... Tho'
we cannot, affirmatively, tell you what we are doing, we can, negatively, tell you what we are not doing--we
never once thought of a king."

The conflict with royal governors had made the public deeply distrustful of powerful executives.
Alexander Hamilton argued for a chief executive to be given broad powers and elected for life. Edmund
Randolph of Virginia thought executive power should not be put into the hands of a single person since a
single executive would be "the fetus of monarchy."

To ensure a check on presidential power, Congress was given the power to override a presidential veto
and to impeach and remove a president. Congress alone was given the power to declare war.

The Judicial Branch

Article III of the Constitution established a Supreme Court.

The Constitution does not specify the size of the Supreme Court. Over the years the designated size of
the Supreme Court has varied between six, seven, nine, and even ten members. Nor does the
Constitution explicitly grant the courts the power of judicial review--to determine whether legislation is
consistent with the Constitution.

Today, no other country makes as much use of judicial review as the United States. Many of our society's
policies on racial desegregation, criminal procedure, abortion, and school prayer are the product of court
decisions. The concept of judicial review was initially established on the state level and in the debates
over the ratification of the Constitution.

In contrast to Britain, American judges do not wear wigs. When the Supreme Court held its first session
in 1790, one justice did arrive wearing a wig. But the public expressed derision at wig wearing, and the
justice decided that republican judges should not wear wigs.

Voting Rights

The Constitution included no property qualifications for voting or officeholding like those found in the
state constitutions drafted between 1776 and 1780. In a republican society, officeholding was supposed
to reflect personal merit, not social rank.

The Constitution did not bar anyone from voting. It only said that voting for members of the House of
Representatives should be the same in each state as that state's requirements for voting for the most
numerous branch of the legislature. In other words, qualifications for voting were left to the individual
states. The New Jersey constitution allowed women to vote if they met the same property requirements
as men.
Page 99

The Vietnam War

The prize-winning photographs are among the most searing and painful images of the Vietnam War era. These
images helped define the meaning of the war. They also illustrate the immense power of photography to reveal war's
brutality.

One photograph shows a Buddhist monk calmly burning himself to death to protest the U.S.-backed South
Vietnamese government. Photographs of this horrific event raised a public outcry against the corruption and religious
discrimination of the government of Ngo Dinh Diem, the Catholic president of South Vietnam. Eight more monks and
nuns immolated themselves in the following months.

Another photograph shows a 9-year-old girl, running naked and screaming in pain after a fiery napalm attack on her
village. The napalm (jellied gasoline) has burned through her skin and muscle down to her bone. The photograph of
her anguished, contorted face helped to end American involvement in the Vietnam War.

A third image shows a stiff-armed South Vietnamese police chief about to shoot a bound Viet Cong prisoner in the
head. The victim, a Viet Cong lieutenant, was alleged to have wounded a police officer during North Vietnam's Tet
offensive of 1968. The photograph became a symbol of the war's casual brutality.

A fourth photograph, taken by a 21-year-old college journalist, shows the body of a 20-year-old student protestor at
Ohio's Kent State University lying limp on the ground, shot to death by National Guardsmen. In the center of the
picture, a young woman kneels over the fallen student, screaming and throwing up her arms in agony.

A fifth picture captured the fall of Saigon during the last chaotic days of the Vietnam War. The photo shows
desperate Vietnamese crowding on the roof of the U.S. Agency for International Development building trying to
board a silver Huey helicopter. Taken on April 30, 1975, the photograph captured the moment when the last U.S.
officials abandoned South Vietnam, and South Vietnamese military and political leaders fled their own country, while
hundreds of Vietnamese left behind raised their arms helplessly.

Photographs have the power to capture an event and burn it into our collective memory. Photographs can trap
history in amber, preserving a fleeting moment for future generations to re-experience. Photographs can evoke
powerful emotions and shape the way the public understands the world and interprets events. Each of these pictures
played a role in turning American public opinion against the Vietnam War. But pictures never tell the full story. By
focusing on a single image, they omit the larger context essential for true understanding.

Phan Thi Kim Phuc, the 9-year-old girl running naked down the road in the photograph, was born in 1963 in a small
village in South Vietnam's Central Highlands. Kim Phuc was the daughter of a rice farmer and a cook. In June 1972,
she and her family took refuge in a Buddhist temple when South Vietnamese bombers flew over her village. Four
bombs fell toward her. The strike was a case of friendly fire, the result of a mistake by the South Vietnamese air
force.

There was an orange fireball, and Kim Phuc was hit by napalm. Her clothes were vaporized; her ponytail was sheared
off by the napalm. Her arms, shoulders, and back were so badly burned that she needed 17 major operations. She
started screaming, "Nong qua! Nong qua!” (too hot!) as she ran down the road. Her scarring is so severe that she will
not wear short-sleeve shirts to this day. She still suffers from severe pain from the burns, which left her without
sweat or oil glands over half of her body.

Two infant cousins died in the attack, but Kim Phuc, her parents, and seven siblings survived. The man who took her
photograph, 21-year-old Huynh Cong "Nick" Ut, was also Vietnamese; his brother was killed while covering combat in
Vietnam's Mekong Delta for the Associated Press. After the napalm attack, Ut put her into a van and rushed her to a
South Vietnamese hospital, where she spent 14 months recovering from her burns.

In 1986, Kim Phuc (whose name means "Golden Happiness") persuaded the Vietnamese government to allow her to
go to Cuba to study pharmacology. In 1992, while in Cuba, she met and married a fellow Vietnamese student. Later
that year, she and her husband defected to Canada while on a flight from Cuba to Moscow. Today, she serves as an
unpaid goodwill ambassador for UNESCO and runs a non-profit organization that provides aid to child war victims.
Her husband cares for mentally disabled adults.

Gen. Nguyen Ngoc Loan, the South Vietnamese police chief who executed the Viet Cong prisoner in 1968, had a
reputation for ruthlessness. While serving as a colonel in 1966, he led tanks and armored vehicles into the South
Vietnamese city of Danang to suppress rebel insurgents. Hundreds of civilians as well as Viet Cong were killed. In
early 1968, at the height of the Tet offensive, Loan was working around the clock to defend the South Vietnamese
capital of Saigon. He had asked a regimental commander to execute the prisoner, but when the commander
hesitated, Loan said, "I must do it. If you hesitate, if you didn't do your duty, the men won't follow you."

The photograph taken at Kent State in Ohio shows a terrified young woman, Mary Ann Vecchio, a 14-year-old
runaway from Florida, kneeling over the body of Jeffrey Miller. Miller, a Kent State University student, had been
protesting American involvement in Vietnam even before attending college. At the age of 15, he had composed a
poem titled "Where Does It End," expressing his horror about "the war without a purpose."

Miller was shot and killed during an anti-war protest that followed the announcement that U.S. troops had moved
into Cambodia. An ROTC building on the university's campus was burned, and in response, the mayor of Kent called
in the National Guard.

On May 4, 1970, the guardsmen threw tear gas canisters at the crowd of student protesters. Students threw the
canisters back along with rocks, the guardsmen later claimed. The 28 guardsmen fired more than 60 shots, killing
four students (two of them protesters) and injuring nine (one was left permanently paralyzed).

A Justice Department report determined that the shootings were "unnecessary, unwarranted and inexcusable," but
an Ohio grand jury found that the Guard had acted in self-defense and indicted students and faculty for triggering
the disturbance.
Page 101

Traditional Family Values and Breakdown of the Family

We hear a great deal these days about traditional family values. But precisely when did traditional family values reign
supreme? Certainly not in the world of the Old Testament or classical antiquity. Let's go back 2,000 years and see
what families were like then.

For one thing, the world of classical antiquity had no word for family. This is a case where the Greeks didn't have a
word for it. Wealthy people had families that contained not only spouses and children but servants, slaves, and a
host of relatives and non-kin. At the same time, many slaves had no families at all.

Secondly, women in the ancient world married at or even before puberty. In ancient Greece, the average woman
married between the ages of 12 and 15. Men married much later, usually in their mid or late 20s. A very substantial
age gap between husbands and wives made families very patriarchal. Women usually were not allowed out of the
house unless chaperoned and covered in heavy robes.

Thirdly, the ancient world permitted a wide range of practices that we find abhorrent. The most startling practice was
called "exposure." In ancient Greece and Rome, newborn children were left out of doors, so that handicapped babies
- and many daughters - would die. These aren't the only practices that we would find inconsistent with traditional
family values. Most ancient societies permitted divorce on demand, polygamy, and concubinage - the cohabitation of
people who were not legally married. This was an earlier form of surrogate motherhood.

When, then, did the modern family emerge? When did romantic love become the basis of marriage? When did the
emotionally-intense, child-centered nuclear family appear? When did mothers become the very center of family life?
Surprisingly, the modern family is just 150 years old.

In colonial America marriages were not based on love. Ministers described romantic love as a form of madness and
urged young men to choose their mates on the basis of rational consideration of property and family. Marriages were
often quite brief. In colonial Virginia, an average marriage lasted just seven years. Till death do us part meant
something quite different than it does today.

Families were large - too large to allow parents to give much attention to each child. The average woman bore
between 8 and 10 children. Fathers, not mothers, were the primary parent. Child rearing advice books were
addressed to men, not women.

In colonial America, children were sent away from home at very young ages. Children of just six or seven were sent
to work as servants or apprentices in other peoples' homes. There was no adolescent rebellion when adolescents
didn't live at home.

It was not until the mid 19th century that the family patterns that we call traditional began to emerge. For example,
it was only during the Victorian era that middle class women began to make motherhood and housekeeping self-
conscious vocations. And it was only in the 19th century that modern household architecture with an emphasis on
personal privacy emerged. It was only then that houses began to have hallways and separate bedrooms.

And yet, we must be careful about assuming that the 19th century was truly an era of traditional family values.
Prostitution was extremely widespread in 19th century America. Our best guess is that between 5 and 10 percent of
young urban women practiced prostitution in 19th century America.

Americans dramatically reduced birthrates during the 19th century - but the major method of birth control was
abortion, which was legal in virtually every jurisdiction before the 1880s.

Above all, families in the 19th century were just as fragile and unstable as families today. The proportion of single-
parent, female-headed families was almost as high in 1900 as it is today - because of the higher death rate.

Today, we tend to assume that families are weaker and more fragile than those in the past. But I think this view is
wrong. From an historical perspective, we invest much more emotional and psychological significance in
family life than did our ancestors. We regard family ties and intimacy as the key to our personal happiness. And as a
result, when our family relationships are unhappy or abusive, we get divorced. Our high divorce rate doesn't reflect a
low valuation on marriage; it reflects our overly high hopes and expectations.

Let me make one more point. Our high expectations have also made family life more conflict-riven. We have
eliminated many opportunities to blow off steam and to reduce the intensity of family relationships. Four centuries
ago, only 8 percent of homicides were within the family, compared to 50 percent today.
Page 102

History of women's suffrage in the United States


American women were granted the right to vote with the passage of the 19th Amendment to the U.S.
Constitution in 1920. The effort to obtain women's suffrage was a primary goal of those
involved in the greater women's rights movement of the 19th century.

Beginnings
During the early part of the 19th century, agitation for equal suffrage was carried on by only a
few individuals. The first of these was Frances Wright, a Scottish woman who came to the country in 1826
and advocated women's suffrage in an extensive series of lectures. In 1836 Ernestine Rose, a Polish
woman, came to the country and carried on a similar campaign, so effectively that she obtained a personal
hearing before the New York Legislature, though her petition bore only five signatures. At about the same
time, in 1840, Lucretia Mott and Margaret Fuller became active in Boston, the latter being the author of the
book The Great Lawsuit; Man vs. Woman. Efforts to gain various women's rights were subsequently led by
women such as Susan B. Anthony, Virginia Minor, Elizabeth Cady Stanton, and Paulina Kellogg Wright
Davis.

Civil War
During the Civil War and immediately after, little was heard of the movement, but in 1869 the National Woman
Suffrage Association was formed by Susan B. Anthony and Elizabeth Cady Stanton, with the object of
securing an amendment to the Constitution in favor of woman suffrage, thus opposing passage of the
Fifteenth Amendment without it being changed to include female suffrage.
Another more conservative suffrage organization, the American Woman Suffrage Association, headed by
Lucy Stone, was also formed at this time by those who believed that suffrage should be brought about by
amendments to the various state constitutions. They supported the proposed 15th amendment as written. In
1890, these two bodies united into one national organization, led by Susan B. Anthony and known as the
National American Woman Suffrage Association.

National American Woman Suffrage Association


In 1900, regular national headquarters were established in New York City, under the direction of the new
president Mrs. Carrie Chapman Catt, who was endorsed by Susan B. Anthony after her retirement as first
president. Three years later headquarters were moved to Warren, Ohio, but were then brought back to New
York again shortly afterward, and re-opened there on a much bigger scale. The organization obtained a
hearing before every Congress, from 1869 to 1919.
Page 103

Regional suffrage
New Jersey, on becoming a state after the American Revolution, placed only one restriction on the
general suffrage — the possession of at least $250 worth of cash or property. The election laws referred to
voters as "he or she." In 1790, the law was revised to include women specifically. Female voters became so
objectionable to professional politicians that in 1807 the law was revised to exclude them. Later, the 1844
constitution banned women from voting; the 1947 constitution then allowed it.
The first territorial legislature of the Wyoming Territory granted women suffrage in 1869. In the following
year, the Utah Territory followed suit. However, in 1887, the United States Congress disenfranchised Utah
women with the Edmunds–Tucker Act. In 1890, Wyoming came into the Union as the first state that allowed
women to vote. In 1893, voters of Colorado made that state the second of the woman suffrage states. In
1895, Utah adopted a constitution restoring the right of woman suffrage. Colorado was the first state where
men voted to give women the right to vote.

Illinois
In 1912, Grace Wilbur Trout, then head of the Chicago Political Equality League, was elected president of the
state organization. Changing her tactics from a confrontational style of lobbying the state legislature, she
turned to building the organization internally. She made sure that a local organization was started in every
Senatorial District. One of her assistants, Elizabeth Booth, cut up a Blue Book government directory and
made file cards for each of the members of the General Assembly. Armed with the names, four lobbyists
went to Springfield to persuade one legislator at a time to support suffrage for women. In 1913, first-term
Speaker of the House, Democrat Champ Clark, told Trout that he would submit the bill for a final vote, if there
was support for the bill in Illinois. Trout enlisted her network, and while in Chicago over the weekend, Clark
received a phone call every 15 minutes, day and night. On returning to Springfield he found a deluge of
telegrams and letters from around the state all in favor of suffrage. By acting quietly and quickly, Trout had
caught the opposition off guard.
After passing the Senate, the bill was brought up for a vote in the House on June 11, 1913. Trout and her
team counted heads and went as far as to fetch needed male voters from their homes. Watching the door to
the House chambers, Trout urged members in favor not to leave before the vote, while also trying to prevent
"anti" lobbyists from illegally being allowed onto the House floor. The bill passed with six votes to spare, 83-
58. On June 26, 1913, Illinois Governor Edward F. Dunne signed the bill in the presence of Trout, Booth and
union labor leader Margaret Healy.
Women in Illinois could now vote for Presidential electors and for all local offices not specifically named in the
Illinois Constitution. However, they still could not vote for state representative, congressman or governor; and
they still had to use separate ballots and ballot boxes. But by virtue of this law, Illinois had become the first
state east of the Mississippi River to grant women the right to vote for President of the United States. Carrie
Chapman Catt wrote:
"The effect of this victory upon the nation was astounding. When the first Illinois election took place in April,
(1914) the press carried the headlines that 250,000 women had voted in Chicago. Illinois, with its large
electoral vote of 29, proved the turning point beyond which politicians at last got a clear view of the fact that
women were gaining genuine political power."
Page 104

Besides the passage of the Illinois Municipal Voting Act, 1913 was also a significant year in other facets of
the women's suffrage movement. In Chicago, African American anti-lynching crusader Ida B. Wells-Barnett
founded the Alpha Suffrage Club, the first such organization for Negro women in Illinois. Although white
women as a group were sometimes ambivalent about obtaining the franchise, African American women were
almost universally in favor of gaining the vote to help end their sexual exploitation, promote their educational
opportunities and protect those who were wage earners.

Other states
One after another, western states granted the right of voting to their women citizens, the only opposition
being presented by the liquor interests and the machine politicians. New York joined the procession in 1917.

19th Amendment
Many groups were opposed to women's suffrage at the time. On January 12, 1915, a suffrage bill was
brought before the House of Representatives but was lost by a vote of 174 to 204. Again a bill was brought
before the House, on January 10, 1918. On the evening before, President Wilson made a strong and widely
publicized appeal to the House to pass the bill. It was passed with one more vote than was needed to make
the necessary two-thirds majority. The vote was then carried into the Senate. Again President Wilson made
an appeal, and on September 30, 1918, the question was put to the vote, but two votes were lacking to make
the two-thirds majority. On February 10, 1919, it was again voted upon, and then it was lost by only one vote.
There was considerable anxiety among politicians of both parties to have the amendment passed and made
effective before the general elections of 1920, so the President called a special session of Congress, and a
bill, introducing the amendment, was brought before the House again. On May 21, 1919, it was passed, 42
votes more than necessary being obtained. On June 4, 1919, it was brought before the Senate, and after a
long discussion it was passed, with 56 ayes and 25 nays. It only remained that the necessary number of
states should ratify the action of Congress. Within a few days Illinois, Wisconsin and Michigan, their
legislatures being then in session, passed the ratifications. Other states then followed their examples, and
Tennessee was the last of the needed 36 states to ratify, in the summer of 1920. The 19th Amendment to the
Constitution was an accomplished fact, and the Presidential election of November 1920, was therefore the
first occasion on which women in all of America were allowed to exercise their right of suffrage. This had the
effect of overriding local laws which confined the right to vote to males only. However, even now some of
those laws are on the statute book.
Page 105

The New Woman

In 1920, after 72 years of struggle, American women received the right to vote. After the 19th Amendment passed,
reformers talked about female voters uniting to clean up politics, improve society, and end discrimination.

At first, male politicians moved aggressively to court the women's vote, passing legislation guaranteeing women's
rights to serve on juries and hold public office. Congress also passed legislation to set up a national system of
women's and infants' health care clinics, as well as a constitutional amendment prohibiting child labor--a measure
supported by many women's groups.

The early momentum quickly dissipated, however, as the women's movement divided within and faced growing
hostility from without. The major issue that split feminists during the 1920s was a proposed Equal Rights
Amendment to the Constitution outlawing discrimination based on sex. The issue pitted the interests of professional
women against those of working class women, many of whom feared that the amendment would prohibit "protective
legislation" that stipulated minimum wages and maximum hours for female workers.

The women's movement also faced mounting external opposition. During the Red Scare following World War I, the
War Department issued the "Spider Web" chart, which linked feminist groups to foreign radicalism. Many feminist
goals remained unachieved in the mid-1920s. Opposition from many Southern states and the Catholic Church defeated
the proposed constitutional amendment outlawing child labor. The Supreme Court struck down a minimum wage law
for women workers, while Congress failed to fund the system of health care clinics.

Women did not win new opportunities in the workplace. Although the American work force included eight million
women in 1920, more than half were black or foreign-born. Domestic service remained the largest occupation,
followed by secretaries, typists, and clerks--all low-paying jobs. The American Federation of Labor (AFL) remained
openly hostile to women because it did not want females competing for men's jobs. Female professionals, too, made
little progress. They consistently received less pay than their male counterparts. Moreover, they were concentrated
in traditionally "female" occupations such as teaching and nursing.

During the 1920s, the organized women's movement declined in influence, partly due to the rise of the new consumer
culture that made the suffragists and settlement house workers of the Progressive era seem old-fashioned.
Advertisers tried self-consciously to co-opt many of the themes of pre-World War I feminism, arguing that the
modern economy was filled with exciting and liberating opportunities for consumption. To popularize smoking among
women, advertisers staged parades down New York's 5th Avenue, imitating the suffrage marches of the 1910s in
which young women carried "torches of freedom"--cigarettes.
Page 106

Feminism Reborn

Women in 1960 played a limited role in American government. Although women comprised about half of the nation's
voters, there were no female Supreme Court justices, federal appeals court justices, governors, cabinet officers, or
ambassadors. Only 2 out of 100 U.S. senators and 15 out of 435 representatives were women. Of the 307 federal
district judges, 2 were women. Of the 7,700 members of state legislatures, 234 were women. Nor were these figures
atypical. Only two American women had ever been elected governor, only two had ever served in a president's
cabinet, and only six had ever served as ambassador.

Economically, women workers were concentrated in low-paying service and factory jobs. The overwhelming majority
worked as secretaries, waitresses, beauticians, teachers, nurses, and librarians. Only 3.5 percent of the nation's
lawyers were women, 10 percent of the nation's scientists, and less than 2 percent of the nation's leading business
executives.

Lower pay for women doing the same work as men was commonplace. One out of every three companies had
separate pay scales for male and female workers. A female bank teller typically made $15 a week less than a man
with the same amount of experience, and a female laundry worker made 49 cents an hour less than her male
counterpart. Altogether, the earnings of women working full-time averaged only about 60 percent of those of men.

In many parts of the country, the law discriminated against women. In three states--Alabama, Mississippi, and South
Carolina--women could not sit on juries. Many states restricted married women's right to make contracts, sell
property, engage in business, control their own earnings, and make wills. Six states gave fathers preference in the
custody of young children after a divorce. In practically every state, men had a legal right to have intercourse with
their wives and to administer an unspecified amount of physical punishment.

Women were often portrayed in the mass media in an unrealistic and stereotyped way. Popular magazines, like
Reader's Digest, and popular television shows like "I Love Lucy" often depicted women as stupid or foolish, jealous of
other women, irresponsible about money, and overly anxious to marry.

In December 1961, President John F. Kennedy placed the issue of women's rights on the national political agenda.
Eager to fulfill a debt to women voters--he had not named a single woman to a policymaking position--Kennedy
established a President's Commission on the Status of Women, the first presidential panel ever to examine the status
of American women. Chaired by Eleanor Roosevelt, the commission issued its report in 1963, the year that Betty
Friedan published The Feminine Mystique. The report's recommendations included a call for an end to all legal
restrictions on married women's right to own property, to enter into business, and to make contracts; equal
opportunity in employment; and greater availability of child-care services.

The most important reform to grow out of the commission's investigations was the 1963 Equal Pay Act, which
required equal pay for men and women who performed the same jobs under equal conditions. The Equal Pay Act was
the first federal law to prohibit discrimination on the basis of gender.

The next year, Congress enacted a new weapon in the fight against sex discrimination. Title VII of the 1964 Civil
Rights Act prohibited discrimination in hiring or promotion based on race, color, religion, national origin, or sex by
private employers and unions. As originally proposed, the bill only outlawed racial discrimination; but in a futile effort
to block the measure, Representative Howard Smith of Virginia amended the bill to prohibit discrimination on the
basis of sex. Some liberals opposed the amendment on the grounds that it diverted attention from racial
discrimination. But it passed in the House of Representatives, 168 votes to 133 votes. "We made it! God Bless
America!" shouted a female voice from the House gallery when the amendment passed.

The Civil Rights Act made it illegal for employers to discriminate against women in hiring and promotion unless the
employer could show that sex was a "bona fide occupational qualification" (for example, hiring a man as an attendant
for a men's restroom). To investigate complaints of employment discrimination, the act set up the Equal Employment
Opportunity Commission (EEOC).

At first, the EEOC focused its enforcement efforts on racial discrimination and largely ignored sex discrimination. To
pressure the EEOC to enforce the law prohibiting sex discrimination, Betty Friedan and 300 other women formed the
National Organization for Women (NOW) in 1966, with Friedan as president. The organization pledged "to take action
to bring women into full participation in the mainstream of American society now, exercising all the privileges and
responsibilities thereof in truly equal partnership with men." NOW filed suit against the EEOC "to force it to comply
Page 107

with its own government rules." It also sued the country's 1,300 largest corporations for sex discrimination, lobbied
President Johnson to issue an executive order that would include women within federal affirmative action
requirements, and challenged airline policies that required stewardesses to retire after they married or reached the
age of 32.

At its second national conference in November 1967, NOW drew up an eight-point bill of rights for women. It called
for adoption of an Equal Rights Amendment (ERA) to the Constitution, prohibiting sex discrimination; equal
educational, job training, and housing opportunities for women; and repeal of laws limiting access to contraceptive
devices and abortion.

Two proposals produced fierce dissension within the new organization. One source of disagreement was the Equal
Rights Amendment. The amendment consisted of two dozen words: "Equality of rights under the law shall not be
denied or abridged by the United States or by any state on account of sex." It had originally been proposed in 1923
to mark the 75th anniversary of the Seneca Falls Women's Rights Convention and was submitted to Congress at
almost every session. For over 40 years, professional women who favored the amendment battled with organized
labor and the Women's Bureau of the Labor Department, which opposed the amendment on the grounds that it
endangered "protective" legislation that set minimum wages and maximum hours for less-skilled women workers.

The other issue that generated controversy was the call for reform of abortion laws. In 1967, only one state
(Colorado) had reformed 19th-century legal statutes that made abortion a criminal offense. Dissenters believed that
NOW should avoid controversial issues that would divert attention away from economic discrimination.

Despite internal disagreements, NOW's membership grew rapidly, reaching 40,000 by 1974 and 175,000 by 1988.
The group broadened its attention to include such issues as the plight of poor and nonwhite women, domestic
violence, rape, sexual harassment, the role of women in sports, and the rights of lesbians. At the same time, the
organization claimed a number of achievements. Two victories were particularly important. In 1967, NOW persuaded
President Lyndon Johnson to issue Executive Order 11375, which prohibited government contractors from
discriminating on the basis of sex and required them to adopt "affirmative action" to ensure that women are properly
represented in their work force. The next year, the EEOC ruled that separate want ads for men and women were a
violation of Title VII of the 1964 Civil Rights Act.
Page 108

Voting Rights

The 1964 Civil Rights Act prohibited discrimination in employment and public accommodations. But many African
Americans were denied an equally fundamental constitutional right, the right to vote. The most effective barriers to
black voting were state laws requiring prospective voters to read and interpret sections of the state constitution. In
Alabama, voters had to provide written answers to a 20-page test on the Constitution and state and local
government. Questions included: Where do presidential electors cast ballots for president? Name the rights a person
has after he has been indicted by a grand jury.

In an effort to bring the issue of voting rights to national attention, Martin Luther King, Jr. launched a voter
registration drive in Selma, Alabama, in early 1965. Even though blacks slightly outnumbered whites in the city of
29,500 people, Selma's voting rolls were 99 percent white and 1 percent black. For seven weeks, King led hundreds
of Selma's black residents to the county courthouse to register to vote. Nearly 2,000 black demonstrators, including
King, were jailed by County Sheriff James Clark for contempt of court, juvenile delinquency, and parading without a
permit. After a federal court ordered Clark not to interfere with orderly registration, the sheriff forced black
applicants to stand in line for up to five hours before being permitted to take a "literacy" test. Not a single black
voter was added to the registration rolls.

When a young black man was murdered in nearby Marion, King responded by calling for a march from Selma to the
state capitol of Montgomery, 50 miles away. On March 7, 1965, black voting-rights demonstrators prepared to
march. "I can't promise you that it won't get you beaten," King told them, "... but we must stand up for what is
right!" As they crossed a bridge spanning the Alabama River, 200 state police with tear gas, night sticks, and whips
attacked them. The march resumed on March 21 with federal protection. The marchers chanted: "Segregation's got
to fall ... you never can jail us all." On March 25 a crowd of 25,000 gathered at the state capitol to celebrate the
march's completion. Martin Luther King, Jr. addressed the crowd and called for an end to segregated schools,
poverty, and voting discrimination. "I know you are asking today, 'How long will it take?' ... How long? Not long,
because no lie can live forever."

Within hours of the march's end, four Ku Klux Klan members shot and killed a 39-year-old white civil rights volunteer
from Detroit named Viola Liuzzo. President Johnson expressed the nation's shock and anger. "Mrs. Liuzzo went to
Alabama to serve the struggle for justice," the President said. "She was murdered by the enemies of justice who for
decades have used the rope and the gun and the tar and the feather to terrorize their neighbors."

Two federal measures helped safeguard the voting rights of black Americans. On January 23, 1964, the states
completed ratification of the 24th Amendment to the Constitution, barring a poll tax in federal elections. At the time,
five Southern states still had a poll tax. On August 6, 1965, President Johnson signed the Voting Rights Act, which
prohibited literacy tests and sent federal examiners to seven Southern states to register black voters. Within a year,
450,000 Southern blacks registered to vote.
Page 109

Women's Liberation

Hosted by Jack Bailey, a gravel-voiced former carnival barker, “Queen For A Day” was one of the most popular
daytime television shows of the 1950s. Five times a week, three women, each with a hard-luck story, recited their
tales of woe--diseases, retarded children, poverty. The studio audience, with the aid of an applause meter, would
then decide which woman had the greatest misfortune. She became "queen for a day." Bailey put a crown on her
head, wrapped her in a mink coat (which she got to keep for 24 hours), and told her about the new Cadillac she
would get to drive (also for the next 24 hours). Then, the queen was presented with gifts: a year's supply of Helena
Rubinstein cosmetics; a Clairol permanent and once-over by a Hollywood makeup artist; and the electric appliances
necessary for female happiness--a toaster oven, an automatic washer and dryer, and an iron. The gifts provided
everything a woman needed to be a prettier and better housewife.

One woman in the television audience was Betty Friedan. A 1942 honors graduate of Smith College and former
psychology Ph.D. candidate at the University of California at Berkeley, Friedan had quit graduate school, married,
moved to the New York suburbs, and borne three children in rapid succession. American culture told her that husband,
house, children, and electric appliances were true happiness. But Friedan was not happy. And she was not alone.

In 1957, Friedan sent out questionnaires to fellow members of her college graduating class. The replies amazed her.
Again and again, she found women suffering from "a sense of dissatisfaction." Over the next five years, Friedan
interviewed other women at PTA meetings and suburban cocktail parties, and she repeatedly found an unexplainable
sense of melancholy and incompleteness. Friedan noted, "Sometimes a woman would say 'I feel empty somehow ...
incomplete.' Or she would say, 'I feel as if I don't exist.'" Friedan was not the only observer to detect a widespread
sense of discontent among American women.

Doctors identified a new female malady, the housewife's syndrome, characterized by a mixture of frustration and
exhaustion. CBS broadcast a television documentary entitled "The Trapped Housewife." Newsweek magazine noted
that the nation's supposedly happy housewife was "dissatisfied with a lot that women of other lands can only dream
of. Her discontent is deep, pervasive, and impervious to the superficial remedies which are offered at every hand."
The New York Times editorialized, "Many young women ... feel stifled in their homes." Redbook magazine ran an
article entitled "Why Young Mothers Feel Trapped" and asked for examples of this problem. It received 24,000
replies.

“Why,” Friedan asked, “were American women so discontented?” In 1963, she published the answer in her book, The
Feminine Mystique. This book, one of the most influential books ever written by an American, helped to launch a new
movement for women's liberation. The book touched a nerve, but the origins of the movement lay in the role of
females in American society.
Page 110

Women's Rights

At the outset of the 19th century, women could not vote or hold office in any state, they had no access to higher
education, and they were excluded from professional occupations. American law accepted the principle that a wife
had no legal identity apart from her husband. She could not be sued, nor could she bring a legal suit; she could not
make a contract, nor could she own property. She was not permitted to control her own wages or gain custody of her
children in case of separation or divorce.

Broad social and economic changes, such as the development of a market economy and a decline in the birthrate,
opened employment opportunities for women. Instead of bearing children at two-year intervals after marriage, as
was the general case throughout the colonial era, during the early 19th century women bore fewer children and
ceased childbearing at younger ages. During these decades the first women’s college was established, and some
men’s colleges first opened their doors to women students. More women were postponing marriage or not marrying
at all; unmarried women gained new employment opportunities as “mill girls” and elementary school teachers; and a
growing number of women achieved prominence as novelists, editors, teachers, and leaders of church and
philanthropic societies.

Although there were many improvements in the status of women during the first half of the century, women still
lacked political and economic status when compared with men. As the franchise was extended to larger and larger
numbers of white males, including large groups of recent immigrants, the gap in political power between women and
men widened. Even though women made up a core of supporters for many reform movements, men excluded them
from positions of decision making and relegated them to separate female auxiliaries. Additionally, women lost
economic status as production shifted away from the household to the factory and workshop. During the late 18th
century, the need for a cash income led women and older children to engage in a variety of household industries,
such as weaving and spinning. Increasingly, in the 19th century, these tasks were performed in factories and mills,
where the workforce was largely male.

The fact that changes in the economy tended to confine women to a sphere separate from men had important
implications for reform. Since women were believed to be uncontaminated by the competitive struggle for wealth and
power, many argued that they had a duty--and the capacity--to exert an uplifting moral influence on American
society. Catharine Beecher (1800–1878) and Sarah J. Hale (1788–1879) helped lead the effort to expand women’s
roles through moral influence. Beecher, the eldest sister of Harriet Beecher Stowe, was one of the nation’s most
prominent educators before the Civil War. A woman of many talents and strong leadership, she wrote a highly
regarded book on domestic science and spearheaded the campaign to convince school boards that women were
suited to serve as schoolteachers. Hale edited the nation’s most popular women’s magazines, the Ladies’ Magazine
and Godey’s Lady’s Book. She led the successful campaign to make Thanksgiving a national holiday (during Lincoln’s
administration), and she also composed the famous nursery rhyme “Mary Had a Little Lamb.”

Both Beecher and Hale worked tirelessly for women’s education (Hale helped found Vassar College). They gave voice
to the grievances of women--abysmally low wages paid to women in the needle trades (12.5 cents a day), the
physical hardships endured by female operatives in the nation’s shops and mills (where women worked 14 hours a
day), and the minimizing of women’s intellectual aspirations. Even though neither woman supported full equal rights
for women, they were important transitional figures in the emergence of feminism. Each significantly broadened
society’s definition of “women’s sphere” and assigned women vital social responsibilities: to shape their children’s
character, to uplift their husbands morally, and to promote causes of “practical benevolence.”

Other women broke down old barriers and forged new opportunities in a more dramatic fashion. Frances Wright
(1795–1852), a Scottish-born reformer and lecturer, received the nickname “The Great Red Harlot of Infidelity”
because of her radical ideas about birth control, liberalized divorce laws, and legal rights for married women. In 1849
Elizabeth Blackwell (1821–1910) became the first American woman to receive a degree in medicine. A number of
women became active as revivalists. Perhaps the most notable was Phoebe Palmer (1807–1874), a Methodist
preacher who ignited religious fervor among thousands of Americans and Canadians.

Catalyst for Women’s Rights

A public debate over the proper role of women in the antislavery movement, especially their right to lecture to
audiences composed of both sexes, led to the first organized movement for women’s rights. By the mid-1830s more
than a hundred female antislavery societies had been created, and women abolitionists were circulating petitions,
editing abolitionist tracts, and organizing antislavery conventions. A key question was whether women abolitionists
would be permitted to lecture to “mixed” audiences of men and women. In 1837 a national women’s antislavery
convention resolved that women should overcome this taboo: “The time has come for women to move in that sphere
which providence has assigned her, and no longer remain satisfied with the circumscribed limits which corrupt
custom and a perverted application of Scripture have encircled her.”

Angelina Grimké (1805–1879) and her sister Sarah (1792–1873)--daughters of a wealthy Charleston, South
Carolina, slaveholding family--were the first women to break the restrictions and widen women’s sphere through
their writings and lectures before mixed audiences. In 1837 Angelina gained national notoriety by lecturing against
slavery to audiences that included men as well as women. Shocked by this breach of the separate sexual spheres
ordained by God, ministers in Massachusetts called on their fellow clergy to forbid women the right to speak from
church pulpits. Sarah Grimké responded in 1838 with a pamphlet entitled Letters on the Equality of the Sexes and the
Condition of Woman, one of the first modern statements of feminist principles. She denounced the injustice of lower
pay and denial of equal educational opportunities for women. Her pamphlet expressed outrage that women were
“regarded by men, as pretty toys or as mere instruments of pleasure” and were taught to believe that marriage is
“the sine qua non [indispensable element] of human happiness and human existence.” Men and women, she
concluded, should not be treated differently, since both were endowed with innate natural rights.

In 1840, after the American Anti-Slavery Society split over the issue of women’s rights, the organization named
three female delegates to a World Anti-Slavery Convention to be held in London later that year. There, these women
were denied the right to participate in the convention on the grounds that their participation would offend British
public opinion. The convention relegated them to seats in a balcony.

Eight years later, Lucretia Mott (1793–1880), who had been denied a seat as a delegate at the World Anti-Slavery
Convention, and Elizabeth Cady Stanton (1815–1902) organized the first women’s rights
convention in history. Held in July 1848 at Seneca Falls, New York, the convention drew up a Declaration of
Sentiments, modeled on the Declaration of Independence, which opened with the phrase “All men and women are
created equal.” It named 15 specific inequities suffered by women, and after detailing “a history of repeated injuries
and usurpations on the part of men toward woman,” the document concluded that “he has endeavored, in every way
that he could, to destroy her confidence in her own powers, to lessen her self-respect, and to make her willing to
lead a dependent and abject life.”

Among the resolutions adopted by the convention, only one failed to pass unanimously--that women be granted
the right to vote. Of the 68 women and 32 men who signed the Declaration of Sentiments at the convention
(including black abolitionist Frederick Douglass), only one lived to see the ratification of the women’s suffrage
amendment to the Constitution 72 years later.

By mid-century women’s rights conventions had been held in every northern state. Despite ridicule from the public
press--the Worcester (Massachusetts) Telegraph denounced women’s rights advocates as “Amazons”--female
reformers contributed to important, if limited, advances against discrimination. They succeeded in gaining adoption
of Married Women’s Property Laws in a number of states, granting married women control over their own income and
property. A New York law passed in 1860 gave women joint custody over children and the right to sue and be sued,
and in several states women’s rights reformers secured adoption of permissive divorce laws. A Connecticut law, for
example, granted divorce for any “misconduct” that “permanently destroys the happiness of the petitioner and
defeats the purposes of the marriage relationship.”

Black women, too, were active in the campaign to extend equal rights to women. One of the most outspoken
advocates for both women’s rights and abolition was Sojourner Truth, born a slave known as Isabella in New York
State’s Hudson River Valley around 1797. She escaped from bondage in 1826, taking refuge with a farm family that
later bought her freedom. She took the name Sojourner Truth in 1843, convinced that God had called on her to
preach the truth throughout the country. Her fame as a preacher, singer, and orator for abolition and women’s rights
spread rapidly. At a women’s rights convention in Akron, Ohio, in 1851, she is reported to have demanded that
Americans recognize African American women's right to equality. "I could work as much and eat as much as a
man--when I could get it--and bear de lash as well!” she told the crowd. “And ain’t I a woman?”

During the Civil War, Truth supported the Union, collecting food and supplies for black troops and struggling to make
emancipation a war aim. When the war was over, she traveled across the North, collecting signatures on petitions
calling on Congress to set aside western lands for former slaves. At her death in 1883, she could rightly be
remembered as one of the nation’s most eloquent opponents of discrimination in all forms.
