“A ruthless and witty debunking of self-flattering illusions held by man over millennia
that nonetheless leaves the reader feeling oddly hopeful, and almost giddy. Who knew
science could be so much fun?”—Rick Shenkman, author of Political Animals: How
Our Stone-Age Brain Gets in the Way of Smart Politics.
“A refreshing, revelatory and poignant look at the fundamental faults of our species,
that also explains our inability to make the bold decisions ensuring the long-term
survival of planet Earth. A must-read for anyone who struggles to comprehend our
species and its disregard for the natural world and the impact and consequences of
our collective and wasteful existence.”—Louise Leakey, Paleontologist and Research
Professor, Turkana Basin Institute, Stony Brook University.
“You’d think Copernicus and Darwin would have sufficed to get humanity over its
superiority complex, but we are still in the middle of shaking it off. David Barash
enlightens us from a solid historical and scientific perspective how far we have come
and how far we still have to go.”—Frans de Waal, author of Are We Smart Enough to
Know How Smart Animals Are?
“There could hardly be a more timely and urgent issue than the role of scientific inquiry
in determining what makes humans human and our proper place in and relationship
to nature. In lucid prose that explains the scientific method to anyone who cares about
the difference between facts and fantasy, David Barash explores the psychological,
social, and physical perils that are inevitable when human beings regard themselves as
being above nature rather than a part of nature. This is a splendid tribute to a human
specialness that depends not on having been created by a divine being but on our
willingness to use reason to deal wisely with the rest of nature. Every literate politician
in Washington should read this book.”—Susan Jacoby, author of The Age of American
Unreason in a Culture of Lies.
“David Barash confronts with friendly erudition and gorgeous range the matter of
what is human nature and why humans fight the facts of complicated life so eagerly.
He does so with kind verve and a responsible salute to the endless role of science and
literature, its mate in seeking meaning.”—Lionel Tiger, Charles Darwin Professor of
Anthropology Emeritus, Rutgers University, and author, most recently of The Decline
of Males and God’s Brain.
“This engaging, energizing and enlightening treatise on man’s place in nature goes a
long way towards reminding all humanity that we are part of the natural world. But
it issues a warning as well: if modern humans continue to ignore this simple fact, it
will be at our peril.”—Donald C. Johanson, Discoverer of “Lucy” and Founder of the
Institute of Human Origins, Arizona State University.
David P. Barash
Oxford University Press is a department of the University of Oxford. It furthers
the University’s objective of excellence in research, scholarship, and education
by publishing worldwide. Oxford is a registered trade mark of Oxford University
Press in the UK and certain other countries.
1 3 5 7 9 8 6 4 2
Printed by Sheridan Books, Inc., United States of America
Part I
The Allure of Human Centrality, or, How We Persistently Try to Deny Our Place in the Natural World—and Fail
Prelude to Part I
Being kicked out of paradise must have been tough on Adam and Eve. The requirement to earn food by the sweat of their brow was doubtless bad enough, not to mention pain in childbirth, but losing immortality and being told that dust they are and to dust will they return, must have been—in modern parlance—a real bummer.a
In any event, those particular losses happened long ago, so no one around today has
directly experienced the trauma. By contrast, in recent times we human beings have been deprived of some of our most beloved and comforting Edenic myths, with others dropping out almost daily, leaving us to confront a growing array of paradigms lost, many of them variants on this fact, easy to state, hard to accept: we are not as important, as special, as all-around wonderful as we’d like.
Confronting this reality—more accurately, trying to negotiate around it—engages
a deep-rooted tendency whereby people begrudgingly accept what is forced on them,
while nonetheless clinging to their most cherished preexisting beliefs, what they want
to be true. Prominent current cases include granting that the Earth’s climate is heating
up but refusing to accept that human beings are responsible, or acknowledging that
evolution is real when it comes, for example, to microbial antibiotic resistance but
denying that it produced us.
It is an inclination that is evidently deep-rooted in the human psyche. Thucydides,
fifth-century BC historian of the Peloponnesian War, complained of his contempo-
raries that “their judgment was based more upon blind wishing than upon any sound
prevision; for it is a habit of mankind to entrust to careless hope what they long for, and
to use sovereign reason to thrust aside what they do not fancy.”
It is difficult to modify an opinion once established, especially if it’s a favorable
one—and even more so if it is centered on one’s self (or one’s species). A particular
consequence of intellectual progress has nonetheless been an understanding of our
increasingly deflated place in nature and in the universe, making it more and more
a As a biologist, I confess to an added personal regret that the serpent was punished by being forced to crawl on its belly, since I would have loved to see how it might have ambulated otherwise: perhaps bouncing on its tail as on a pogo stick.
untenable to see ourselves as somehow outside of—never mind superior to—the rest of “creation.” Through a Glass Brightly therefore outlines a less grandiose but more bracingly accurate view of ourselves, thanks to modern science.
It is the height of paradox. The more we learn about our own species and the world
around us, the more we are humbled, forced to relinquish some of our most cherished
illusions, especially our unique centrality. This demotion was given a dramatic push
by Copernicus, Kepler, and Galileo, who upended the Ptolemaic view that the cosmos
revolves around a central and immobile planet Earth. It is difficult for us in the twenty-first century to appreciate how troublesome, even painful, it was for our home—and by extension, ourselves as well—to be so irrefutably downgraded.b Emblematic of this,
the sixteenth-century Danish astronomer Tycho Brahe (one of the finest scientists
of his day and probably the greatest naked-eye astronomer of all time) proposed an
alternative to the Copernican system. According to Brahe’s paradigm, the five known
planets—Mercury, Venus, Mars, Jupiter, and Saturn—all circled the Sun, but that conglomeration in turn revolved around a central Earth. Many astronomers note, incidentally, that Brahe’s proposed system was in fact a good fit with the data available to
him, and that his “blunder” wasn’t so much a result of prevailing religious belief as an
understandable reluctance to discard the reigning Earth-centered system and replace
it with the newer solar one unless the evidence was indisputable.
Such an adjustment was, however, ultimately necessary, and although immensely consequential, it was part of an even broader and deeper substitution, what the pioneering sociologist Max Weber called the “disenchantment of the world,” exemplified by Galileo’s more general discovery that the world lends itself to material explanations: objective as distinct from subjective, natural rather than supernatural. In The
Myth of the Machine, the historian Lewis Mumford complained,
Galileo committed a crime far greater than what any dignitary of the Church accused him of; for his real crime was trading the totality of human experience for that minute portion which can be observed and interpreted in terms of mass and motion . . . . In dismissing human subjectivity Galileo had excommunicated history’s central subject, multi-dimensional man . . . . Under the new scientific dispensation . . . all living forms must be brought into harmony with the mechanical world picture by being melted down, so to say, molded anew to conform to a more mechanical model.1
b It has been argued, by the way, that not all contemporary theologians and philosophers felt that the center of the universe was such a good place to be. Thus, the center of the Earth was widely considered to be the abode of hell, and the center of the universe, not much better.
It is said that after being forced to recant his claim that the Earth moves around the
sun, Galileo muttered to himself, “E pur si muove” (And yet it moves). The story might
be apocryphal, but the underlying mechanical model, the cosmic machine of which
everyone and everything is a part, is no myth. Nor is the resistance that it continues
to evoke.
After the Copernican revolution and the one that Galileo initiated (which is still a
work in progress) came the Darwinian revelation that we, along with the rest of the
living world, aren’t the products of Special Creation, but rather the results of a natural,
material process that physically connects us to all other organisms. Even now, opponents of evolution cling desperately to the illusion that human beings—and, in some cases, living things generally—are so special that only a benevolent Creator could have produced them. For these people, it remains a hard sell that the organic world, like the
sun and its five planets, doesn’t revolve around us.
The third major leg of this troublesome triad was initiated by Freud, who (despite his occasional crackpot flights of fancy) came up with at least one solid and highly consequential discovery: the existence of the unconscious. Regardless of what one thinks
of “penis envy,” the “Oedipus complex,” and so forth, there is general agreement that
the human mind is like an iceberg, with much of its mass hiding below the conscious
waterline.
So, not only have we been kicked out of our presumed astronomical centrality,
immersed in a world of materiality and deprived of our widely assumed creaturely
uniqueness, but we aren’t even masters in what seemed to be left to us, our pride and
joy: our rational, conscious minds.
Of course, there are many people for whom the more we learn about the natural
world, the more wonderful it is revealed to be, and thus, the more magnificent its
Creator. It is likely, nonetheless, that insofar as human beings are perceived as “natu-
ral,” and thus explicable in terms of widely accepted scientific principles rather than
uniquely fashioned by supernatural intervention, the more resistance will be evoked
among those committed not just to human specialness but also to perceiving this specialness as evidence for divine power and intervention. It is hard enough to adjust your
opinion—think of how much easier it is to change your clothes than to change your
mind—harder yet to relinquish a cherished perspective. Especially one that has the
blessing of religious belief. As a remark long attributed to Jonathan Swift has it, “You cannot reason a person out of a position he did not reason himself into in the first place.”
The only constant, nevertheless, is change. The story is told of an ancient ruler who
tasked his advisers to come up with a statement that would be true at all times and for
all occasions. Their response: “This too shall pass.” But although the world’s factual
details are constantly shifting (as the philosopher Heraclitus pointed out, you cannot
step in the same river twice, and, as Buddhists note, all things are impermanent), the
basic rules and patterns underlying these changes in the physical and biological world
are themselves constant. So far as we know, light traveled at the same speed during
the age of dinosaurs, during the Peloponnesian War, and today. The Second Law of
Thermodynamics is true and was true long before Carnot discovered this principle,
just as special and general relativity was valid before being identified by Einstein.
Compared to the apparently unchanging nature of physical law, our insights are
always “evolving,” along with living things themselves, although recognizing and
understanding these insights often requires a major paradigm shift. Interestingly,
although much has been learned (and more yet, hypothesized!) about how science
proceeds to generate reliable knowledge, relatively little is known about how and why
people—including scientists themselves—change their personal beliefs. On the one
hand, we have Max Planck’s famous quip, “A new scientific truth does not triumph
by convincing its opponents and making them see the light, but rather because its
opponents eventually die, and a new generation grows up that is familiar with it.” And
on the other, the more optimistic and probably widespread notion that, eventually, the
truth will out.
To be clear, I am not claiming that clinging to factual error is necessarily the result of
benighted religious prejudice or the simple psychology of denial. Sometimes, incorrect
scientific ideas enjoy popularity because they are a good fit with current empirical data.
Initially, nearly all competent astronomers resisted Copernicus’s model, at least in part
because it didn’t accord any better with astronomic observations than did the regnant
Ptolemaic one. However, at least some of that resistance was due, as well, to the painful
emotional and theological reorientation necessitated by its acceptance.
“All truths are easy to understand once they are discovered,” wrote Galileo. “The point is
to discover them.”2 Much as I revere Galileo, I am not at all sure that in this regard he was
correct. Sometimes, the problem isn’t simply to discover truths but to accept them, which
is especially difficult when such acceptance requires overcoming the bias of anthropocen-
trism, whereby people put their own species at the center of things. Although my hope is
that seeing Homo sapiens through the bright glass of science will contribute to human-
ity understanding and accepting itself, given the stubborn persistence of anthropocentric
thinking, I cannot promise success. The writings of the “new atheists” offer a possible par-
allel: Dawkins, Harris, Dennett, and Hitchens do not appear to have converted people to
atheism so much as they have helped initiate a discussion, such that even though atheism
is not nearly (yet?) mainstream, it has become more respectable.
Thomas Kuhn famously suggested that each science operates within its own paradigm, which limits the ability of its practitioners to conceive other approaches—until a “paradigmatic revolution” supplants the prior intellectual system with a new one, which in turn, is similarly limiting. A related problem is that of “unconceived
brain which, compared to that of a person, is primitive.” In her book, The Incredible
Unlikeliness of Being: Evolution and the Making of Us, Alice Roberts5 points out, “All
tetrapods around today have five digits or fewer at the end of their limbs. So it seems
reasonable to assume that we’ve all descended from a five-fingered, or pentadactyl,
ancestor.” Accordingly, at least with respect to our toes and fingers, we are primitive
rather than advanced. On the other hoof, by contrast, a horse’s toe, at the end of each
limb, consists of but a single digit, making it more advanced than ourselves, at least
when it comes to its tip-toe middle digit—technically the digitus impudicus—whereby
Equus caballus is more different from the ancestral vertebrate condition than we are.
(This also means, Dr. Roberts notes, that horses are walking around giving us the
finger.)
In most other respects, our demotion—more accurately, our inclusion in the material bestiary of the real world—courtesy of science, is not only long overdue but somewhat more serious. When Carl Sagan famously informed his television audience that
we are all made of “star stuff,” the deeper implications may well have been lost on many
of his fellow star-stuffed critters. Please meditate, for a moment, on the fact that there
is literally nothing special about the atoms of which everyone is composed. Even in
their statistical preponderance by mass, these elements reflect rather well the chemical
composition of the universe as a whole: oxygen, carbon, hydrogen, nitrogen, calcium,
and so forth. Of course, there is something special about the way these common components are arranged; that’s the work of natural selection, which, when presented with
alternatives, multiplied and extended the frequency of those combinations that were
comparatively successful in replicating themselves. All this, in turn, further highlights
the degree to which we are cut from the same cloth.
Recall Socrates’s dictum, “The unexamined life is not worth living.” The issue, with
respect to the present book, is not so much examining your own life, or human life
generally, but rather, understanding both and doing so with humility, honesty, and
an expanded sense of interconnectedness and potential. According to the King James
Version of the Bible, in 1 Corinthians 13:12, Paul wrote, “For now we see through a
glass, darkly,” an observation that—suitably modified—led to the title of the present
book. Paul went on to write that after this restricted, darkened field of vision, we could
look forward, upon meeting God, to seeing “face to face,” adding, “now I know in part;
but then shall I know even as also I am known.” Fine for believers, but for the secularists among us, there is even better news: through the glass of science, we can all know
and be known, and see brightly, not in heaven but here and now.
Yet there is some wisdom in Paul’s “darkly,” namely that we don’t necessarily see the
world with perfect accuracy. Why not? Because we haven’t evolved to do so. The fact
that we can penetrate some of the universe’s deeper secrets, unravel our own DNA,
and so forth, is remarkable, but not literally miraculous. Just as the human nose didn’t
evolve to hold up eyeglasses, but does a good job at it, and binocular vision evolved
to enable our arboreal primate ancestors to navigate their three-dimensional lives and
has subsequently done a good job enabling us to throw objects accurately, drive cars,
and pilot airplanes, our five senses along with our cognitive complexity and sophistication evolved for many possible reasons, including navigating an increasingly complex
and sophisticated social life, engaging in elaborate communication skills, making and
manipulating tools and other devices, predicting the future, and so forth.
Once they became part of our armamentarium, human intelligence and perception have underwritten all sorts of additional activities, such as exploring the universe as well as
our own genome and composing symphonies and epic poems; the list is nearly endless,
but the basic point is that we didn’t evolve with an explicit adaptive capacity to do these
things. They were repurposed from neuronal structures and capabilities that emerged for other reasons, not unlike pedestrian curb cuts that have been engineered to permit wheelchair access from street to sidewalk, but are now used at least as much by
bicyclists and skateboarders. The biological reality is that our perceived separateness
may well have evolved so as to promote the success of our constituent genes, but at the
same time, there was little or no evolutionary payoff in recognizing not so much our
limitations as our lack thereof.
John Milton wrote Paradise Lost to “justify the ways of God to men.” In the end, what
justifies science to men and women is something more valuable and, yes, even more
poetic than Milton’s masterpiece or Paul’s vision: the opportunity to consume the fruits of our own continually reevaluated, deeply rooted, admittedly imperfect, and yet profoundly nourishing Tree of Scientific Knowledge, whereby we increasingly understand
ourselves as we really are. I hope that most people will find more pleasure than pain
in using science to do so, and in the process, seeing themselves and their species more
accurately and honestly—more brightly, in every sense of that word—than ever before.
Since this hope might well seem overly optimistic—even downright smug—this
is a good time to introduce something of a counternarrative, a brief meditation on
Piss-Poor Paradigms Past: examples of received wisdom that, in their time, went pretty much unquestioned, even among those constituting the scientific establishment. My purpose here is not to cast doubt or aspersions on the scientific enterprise.
Quite the opposite. It is to remind the reader that science is an ongoing process, and
that whereas the Tree of Scientific Knowledge is a many splendored thing, it also
consists of many branches that have ultimately proven to be weak—in some cases,
perilously so.
Ironically, some people lose faith in science because of the regular revisions it undergoes, the irony being that it is precisely because science is constantly being revised that
we are getting closer and closer to what we can unblushingly call the truth. In short,
“what makes science right is the enduring capacity to admit we are wrong.”6
And there is no doubt that wrong has happened; science, or what used to pass for
science, has undergone much pruning, in the course of which the following limbs
(once thought strong) are among the many that have been amputated: vitalism (the idea that living things possess some sort of unique life force or “élan vital”), spontaneous generation (rats and maggots emerge from garbage, etc.), confidence that alchemy would enable its practitioners to turn base metals into gold, and widespread and stubborn belief in weird substances such as luminiferous aether, phlogiston, and caloric.
In retrospect, these now discredited concepts, which seem downright foolish via 20-20 hindsight, were reasonable in their day. Take the aether, which was seen as necessary to understand the otherwise mysterious behavior of light. So clear-cut was its
apparent legitimacy that James Clerk Maxwell—probably the greatest physicist of the
nineteenth century, and whose equations for electromagnetism are still fundamental
today—asserted that of all theoretical concepts in physics, the aether was the most
securely confirmed. In agreement were two of Maxwell’s most notable contemporary
physicists: Lord Kelvin and Heinrich Hertz. The latter’s research on the propagation of
radio waves had given further credence to the consensus that aether was needed as a
substance through which both light and radio waves were transmitted.
For centuries, scientists also assumed the dogma of an unchanging Earth and a steady-state universe—now dramatically replaced by continental drift and the Big Bang, respectively. The renowned British astronomer Fred Hoyle coined the phrase “Big Bang” as a sarcastic response to what he perceived as a ludicrous alternative to the then-regnant concept of an unchanging cosmos. Now the Big Bang is received wisdom, along with the finding that there are signs of prior water on Mars, but no artificial canals, the existence of which was claimed by Percival Lowell, another famous astronomer.
Some of the most dramatic scientific paradigm shifts have involved biomedicine.
Consider, for example, the long-standing insistence that there are four humors—blood, yellow bile, black bile, and phlegm—corresponding, it was thought, to human temperaments: sanguine, choleric, melancholic, and (no surprise here) phlegmatic, respectively.
And don’t forget bloodletting as a widely acknowledged and scientifically “proven” medical
treatment, now known to have hastened George Washington’s death and long practiced
through much of the Western world. (The term “leech,” historically applied to physicians,
didn’t derive from their presumed avariciousness, but rather, from the use of blood-sucking
leeches as an instrument for ostensibly therapeutic exsanguination.)
Thanks to Pasteur, Koch, Lister, and other pioneering microbiologists, we have come to understand the role of pathogens in causing disease, resulting in the scientific discovery that “germs are bad.” This particular paradigm—displacing belief in “bad air”
and the like (“influenza” derives from the supposed “influence” of miasmas in causing
disease)—was vigorously resisted by the medical establishment. Doctors who would
over the health consequences of dietary cholesterol, red wine, caffeine, and so forth.
A cartoon in The New Yorker showed a large, glowering, shapeless something-or-other
poised outside a bakery, with the caption reading “The Gluten’s back. And it’s pissed.”
exceptionally, and altogether wonderfully large. But as Cartmill pointed out, the
weight of the Homo sapiens brain (1–2 kg) bumped up against the awkward fact that
the brains of elephants are larger (5–6 kg), and those of whales (up to 7 kg) are larger
yet. This unwanted and uncomfortable reality brought forth a focus on relative brain
size—comparing species by looking at brain weight in proportion to body weight.
Gratifyingly, it happens that this number is substantially higher for Homo sapiens
(1.6%–3.0%) than for elephants (0.09%) or whales (0.01%–1.16%). So far, so good.
Cartmill noted, however, that even in the realm of relative brain size, we are equaled or exceeded by many small mammals, including squirrel monkeys (2.8%–4.0%), red squirrels (2.0%–2.5%), chipmunks (3.0%–3.7%), and jumping mice (3.4%–3.6%). And so, “allometric analysis” was then “invoked to rescue the axiom of human cerebral preeminence. The first step in such an analysis is to assume that the interspecific regression of the logarithm of brain weight on that of body weight ought to be a straight line.” Without getting into the details of allometric analysis, suffice it to say that
even with this mathematical adjustment, porpoises ended up being “embarrassingly”
close to human beings and so another way out was needed. What about assuming that
brain size should be proportional to an organism’s total metabolic energy expenditure,
that is, looking at the amount of energy invested in each creature’s brain in proportion to its total energy budget? Sure enough, if we obtain a measure of total metabolic
expenditure, by multiplying body weight times baseline metabolic rate, it turns out
that porpoises invest proportionately less energy in brain maintenance than do human
beings. Even in this case, however, there is a problem, since as Cartmill observed, it is
“a maneuver that a lizard might with equal justice use to prove that mammals don’t
really have bigger brains than reptiles, but only higher metabolic rates.”
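The succession of yardsticks Cartmill describes (absolute brain weight, then relative brain weight, then brain investment scaled against metabolic expenditure) can be sketched numerically. The figures below are rough illustrative values, not measured data, and the use of a 0.75-power (Kleiber) scaling law as a stand-in for baseline metabolic rate is my assumption, not the text’s:

```python
# Three successive measures of "brain superiority" from the passage above.
# Weights are rough illustrative values in kg, not measured data.
species = {
    # name: (brain_kg, body_kg)
    "human":    (1.4,    65.0),
    "elephant": (5.5,  5000.0),
    "porpoise": (0.5,    55.0),
    "chipmunk": (0.002,   0.06),
}

def relative_brain_size(brain, body):
    """Brain weight as a percentage of body weight."""
    return 100.0 * brain / body

def brain_energy_share(brain, body, exponent=0.75):
    """Brain weight scaled against total metabolic expenditure, here
    approximated by Kleiber's law (metabolic rate ~ body mass ** 0.75).
    Units are arbitrary; only the ranking between species matters."""
    return brain / body ** exponent

for name, (brain, body) in species.items():
    print(f"{name:9s} relative={relative_brain_size(brain, body):5.2f}%  "
          f"energy-scaled={brain_energy_share(brain, body):.4f}")
```

On these toy numbers the chipmunk outranks the human on relative brain size, while the metabolic rescaling restores human preeminence, which is exactly the sequence of rescues, and their ad hoc flavor, that Cartmill criticizes.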
The above brain brouhaha doesn’t even touch the case of learning capacities among
insects, whose brains are small indeed: fruit flies average only about 250,000 neurons
per brain, and yet they are capable of learning to avoid certain stimuli and to seek
out others, to orient themselves via a mental map of their surroundings, and so forth.
Moreover, bumblebees—which have approximately 1 million neurons in their brains
(a gratifyingly small number compared to mammals)—have recently been shown
capable of learning to do something unlike any behavior they are likely to encounter in
nature, namely to roll a little ball into the center of a platform in order to receive a small
dose of sugar water. Not only that, but individual bumblebees also learn this relatively
complex and heretofore unfamiliar behavior more rapidly if given the opportunity to
watch other bees learning the task.9 “Observational learning” of this sort had previously been considered a sign of higher mental powers, especially found in, well, us.
Writing about shared “intellectual faculties,” Darwin conceded in his 1871 book,
The Descent of Man, and Selection in Relation to Sex, “Undoubtedly, it would have been
very interesting to have traced the development of each separate faculty from the state
in which it exists in the lower animals to that in which it exists in man; but neither my
ability nor knowledge permit the attempt.” A lot has happened in the intervening time,
and although the evidence is accumulating rapidly, it is also resisted by many—and
not just religious fundamentalists and spokespeople for the beef and dairy industries.
The struggle against recognizing mental continuity between humans and other animals has taken place in many domains, including, for example, language, the meaning of which has regularly been revised whenever detailed research revealed that nonhuman animals possessed it. Once it became evident that other creatures communicated sophisticated information to each other (such as the “dance of the bees,” whereby a forager communicates complex information about the location and even the desirability of a food source to her hive-mates), language was redefined as synonymous with something else: the establishment of arbitrary signs, such as the word “dance” meaning a pattern of complex, rhythmic movements, as opposed to whatever is involved in doing any particular kind of dance.
The influential cultural anthropologist Leslie White spoke for many when he
asserted,
The lower animals may receive new values, may acquire new meanings, but they cannot create and bestow them. Only man can do this . . . . And this difference is one of kind, not of degree . . . . Because human behavior is symbol behavior and since the behavior of infra-human species is non-symbolic, it follows that we can learn nothing about human behavior from observations upon or experiments with the lower animals.10
Then, it having been demonstrated that some animals are in fact capable of assigning meaning to arbitrary signals, a movement began to identify not signs, but syntax as the sine qua non of “real” language—which is to say, what human beings do.
Note the distinct echoes of Tycho Brahe, developing new and creative ways to
retain humanity’s special place. The persistent search for human exceptionalism
whereby our biology renders us discontinuous from other animals is, if not quite a
fool’s errand, one persistently undertaken by a subset of Homo sapiens who—so long
as they base their search on science rather than metaphysics or theology—are doomed
to disappointment.
The best view in Warsaw, Poland, is from the top of the Palace of Science and Culture,
because that is the only place in the city from which one cannot see this example of
Stalinist architecture at its worst. Being too close to the object of our scrutiny is inev-
itably a problem, which makes it all the more difficult—as well as important—to take
a close and careful look at ourselves, mindful that any such view (even, perhaps, the
evolutionary one promoted in this book) is liable to distortion.
Nonetheless, as T. S. Eliot proclaimed,11 and here I suggest substituting "ourselves"
for "the place":

We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.

Add to this, as well: "and not for the last time, either."
NOTES
1. Lewis Mumford, The Myth of the Machine (New York: Harcourt, 1967).
2. Galileo, Dialogue Concerning the Two Chief World Systems (1632).
3. Spinoza, Ethics.
4. Asimov, cited in Maria Popova, "Isaac Asimov on Science and Creativity in Education," https://www.brainpickings.org/2011/01/28/isaac-asimov-creativity-education-science/.
5. Alice Roberts, The Incredible Unlikeliness of Being: Evolution and the Making of Us (London: Heron Books, 2016).
6. L. Rosenbaum, "The March of Science—the True Story," New England Journal of Medicine 377, no. 2 (2017): 188–191.
7. Jean-Paul Sartre, Anti-Semite and Jew (New York: Grove Press, 1962).
8. M. Cartmill, "Human Uniqueness and the Theoretical Content in Paleoanthropology," International Journal of Primatology 11 (1990): 173–192.
9. O. J. Loukola, C. J. Perry, L. Coscos, and L. Chittka, "Bumblebees Show Cognitive Flexibility by Improving on an Observed Complex Behavior," Science 135 (2017): 833–836.
10. Leslie White, The Science of Culture (New York: Grove Press, 1949).
11. T. S. Eliot, Four Quartets (New York: Harcourt, 1943).
1
The Journey to Brobdingnag
At the same time, Gulliver finds himself repulsed rather than aroused by their naked-
ness, because their huge size exaggerates the dimensions of their skin blemishes and
gaping pores.
Although Gulliver eventually escapes from Brobdingnag, he cannot get away from
his low opinion of the human species and his sense of its insignificance—a perspective
shared in the present book, although for our purposes the issue is not that people are
downright pernicious or odious but that when it comes to inherent significance they
are literally smaller than the Lilliputians were to Gulliver, or Gulliver to the giants of
Brobdingnag. It’s a perspective that may well be enhanced if we ever discover signs of
life on other “heavenly” bodies, or simply recognize the deeper consequences of the
fact that we occupy an average planet, orbiting a rather unexciting star in an out-of-
the-way corner of a comparatively trivial galaxy. And that we have come to exist as a
result of purely material processes (notably natural selection combined with the laws of
chemistry and physics), devoid of any deeper meaning or cosmic consequence.
This is not to say, however, that Homo sapiens isn’t important. We are crucially and
organically connected to all other life forms, which gives us a claim—although not
a unique one—to a certain expansive grandeur. We are also immensely consequen-
tial for ourselves, in the same way that members of a baboon troop or a human fam-
ily are important to each other. Moreover, we are important in ways that go beyond
our significance as individuals and as organic beings, in that we—more, perhaps, than
any other species—have already had immense practical impact on our planet and its
creatures, and promise (more accurately, threaten) to do even more. Environmental
scientists—beginning, it appears, with Nobel prize–winning atmospheric chemist Paul
Crutzen—have argued for some time that we are living in our own human-created era,
the Anthropocene, a time in which the cumulative effect of our activities dominates
the machinery of Earth. Geologists resisted this concept, maintaining that establishing
a new, recognized epoch requires not only a clear origination point but also something
that constitutes a permanent and worldwide demarcation, equivalent, for example, to
the extinction of the dinosaurs nearly 70 million years ago, which marked the end of
the Cretaceous. Writing in the International Geosphere-Biosphere Programme news-
letter in 2000, Crutzen and fellow atmospheric scientist Eugene Stoermer urged none-
theless that, given the key role played by human beings in Earth’s planetary ecosystem,
the concept of Anthropocene (“human era”) was fully appropriate.
As to when precisely the Anthropocene began, Crutzen and Stoermer suggested that
to assign a more specific date to the onset of the “anthropocene” seems some-
what arbitrary, but we propose the latter part of the 18th century, although
we are aware that alternative proposals can be made (some may even want to
include the entire holocene). However, we choose this date because, during the
past two centuries, the global effects of human activities have become clearly
noticeable. This is the period when data retrieved from glacial ice cores show
the beginning of a growth in the atmospheric concentrations of several “green-
house gases”, in particular CO2 and CH4. Such a starting date also coincides
with James Watt’s invention of the steam engine in 1784.1
Other possible markers for the commencement of the Anthropocene include the
early 1950s, when atmospheric nuclear testing added layers of radioactive fallout
worldwide; the (geologically speaking) nearly instantaneous accumulation of aluminum,
plastic, and concrete particles, notably in the oceans; the suddenly high global
soil levels of phosphate and nitrogen derived from fertilizers; and even, for some,
the widespread appearance of domestic fowl, whose bones can now be found in geologic
deposits throughout the globe. Regardless of the precise demarcation point,
which has yet to be agreed, in 2016 the Working Group on the Anthropocene recom-
mended overwhelmingly to the International Geological Congress that this new epoch
be recognized.
For our purposes, the key point is that anthropodiminution is very different from
“anthropodenial,” a refusal to acknowledge that human beings have been exerting an
immense influence—much of it malign—on the planet Earth. The Anthropocene is
real. So is anthropocentrism, the conceit that figuratively, if not literally, the universe
revolves around Homo sapiens. But anthropocentrism is “real” only in the sense that
many people believe it, even though it isn’t true.
NOTE
1. P. Crutzen and E. Stoermer, International Geosphere-Biosphere Programme (IGBP) Newsletter 41 (2000).
The Allure of Human Centrality
2
From Centrality to Periphery
According to Francis Bacon, in his essay Prometheus: or the State of Man, com-
posed roughly four centuries ago, "Man, if we look to final causes, may be regarded as the
centre of the world . . . for the whole world works together in the service of man . . . . All
things seem to be going about man’s business and not their own.” This is classic anthropo-
centrism, a perspective that is comforting and not uncommon, although it is completely
wrong.1 Think of the mythical, beloved grandmother, who lined up her grandchildren
and hugged every one while whispering privately to each, “You are my favorite!” We
long to be the favorite of god or nature, as a species no less than as individuals, and so,
not surprisingly, we insist on the notion of specialness. The center of our own subjective
universe, we insist on being its objective center as well. This is the same error that led
Thomas Jefferson to react as follows to the discovery of fossil mammoth bones in his
beloved Virginia: “Such is the economy of nature, that no instance can be produced of
her having permitted any one race of animals to become extinct.” And maybe, even now,
in some as yet undiscovered land, there are modern mastodons, joyously cavorting with
giant sloths and their ilk, testimony to the unflagging concern of a deity or at minimum,
a natural design, that remains devoted to all creatures . . . especially, of course, ourselves.
Just don’t count on it.
It might be useful to introduce a new word to go with anthropocentrism: “modo-
centrism” (derived from the Latin for “now” or “current”). Modocentrism would refer
to the peculiarly widespread idea that modern times—or whatever era or date is cur-
rently in question—are unique in history. Of course, as Heraclitus pointed out, one
cannot step in the same river twice; every situation and every slice of time is unique
unto itself. However, there seems to be a widespread illusion that present days are truly
exceptional, either in how good they are, or (more often) how bad, also in how inter-
esting they are, how consequential, and so forth. Modocentrism is thus equivalent to
each of the grandmother’s “favorites” believing that he or she is not only special but
also occupies a unique and notable time frame.
Modocentrism is operating when people announce—as they have apparently been
doing throughout history—that children have never been as ____(fill in the blank) as
they are these days. Or that we are living in a time of extraordinary peril, in particular
because of the danger posed by global climate change and the threat of nuclear apoca-
lypse. In fact, a strong objective case can and should be made that because of these twin
looming disasters, we are indeed living in a time of extraordinary peril! Maybe this
is yet another example of modocentrism, in which case it reveals how encompassing
such a perspective can be; on the other hand, just as Freud is reputed to have noted that
sometimes a cigar is just a cigar, sometimes what appears to be a time of unique dan-
ger may really be a time of unique danger. Identifying the Anthropocene might in any
event be seen as a case of modocentrism. But just as paranoids, too, can have enemies,
sometimes modocentrism is correct.
It is easy, nonetheless, to fall victim to erroneous modocentrism, as with the assertion
that the twenty-first century is unique, for example, in how disrespectful children are to
their parents, how busy people are (or how much free time they have), how bombarded
they are with information, how disconnected they are from each other on a human level,
how much violence we experience (or how peaceful most human lives have become),
compared with some—or any—other times. It is also tempting to conclude that
modernity—at least, beginning with the physics revolution of the twentieth century—has
been uniquely disorienting for humanity. Thus, quantum mechanics raises weird issues
regarding cause and effect, just as relativistic mechanics does to the meaning of space, while
both revolutions have effectively upended the seemingly unitary “arrow of time.”
Also in the twentieth century came Alfred Wegener’s theory of continental drift,
first published in 1912 and originally scorned by the overwhelming majority of geolo-
gists, but now abundantly confirmed as part of our general understanding of plate
tectonics. This was also roughly the same time that, as already mentioned, Freud was
upending confidence in our own conscious, rational mind.
And yet, these were not the first disorienting episodes to confront our species.
Especially notable in the nineteenth century was Darwin’s identification of evolution
by natural selection as the mechanism by which all living things—including human
beings—have been and are still being “created.”
It is no exaggeration to note that the most significant yet widely denied consequence
of biological insights is the fact that Homo sapiens, like the rest of the natural world,
were produced by a strictly material, altogether natural process, namely, evolution by
natural selection. “Descended from apes?” the wife of a prominent Victorian bishop is
reported to have expostulated. “Let us hope it isn’t true! But if it is true, let us hope that
it doesn’t become widely known!” Well, it is true, and it is widely known among anyone
scientifically knowledgeable, although according to a 2014 Gallup survey, an astonishing
42% of Americans* believe that God—not evolution—created human beings, and
did so within the last 10,000 years.

*Including the vice president of the United States, elected in 2016.