
October 2020

Cognitive Biases:
Causes, Effects, and Implications for Effective
Messaging

Quick Look

Sabrina Polansky, Ph.D., NSI Inc.


Tom Rieger, NSI Inc.

POC: Sabrina Polansky, spolansky@nsiteam.com

Prepared for:
Strategic Multilayer Assessment
Integrating Information in Joint
Operations (IIJO)

DISTRIBUTION STATEMENT A. Approved for public release: distribution unlimited.



Cognitive Biases: Causes, Effects, and Implications for Effective Messaging
“What we’ve got here is failure to communicate.”
Cool Hand Luke

Background
In the current environment of contested norms, increasing great power competition, and rapid technological change, protecting and furthering US interests requires the ability to compete effectively in the information environment (IE). In recent years, the IE has seen the proliferation of actors, channels of communication, and messages. To compete in this crowded and contested environment, we must be able to communicate effectively. Yet, the IE is complicated, available information is often vast and imperfect, time is constrained, and, additionally, we are limited by our own cognitive capacity.

Humans have developed several adaptations to address these cognitive limitations, including heuristics and other mental shortcuts that facilitate survival by enabling more efficient processing of information from the environment. Unfortunately, these same shortcuts hardwire biases into our thinking and communication, which render messaging efforts ineffective and open to manipulation by adversaries seeking to mislead or confuse. These cognitive biases can lead to inaccurate judgments and poor decision-making that could trigger either unintended escalation or failures to identify threats in a timely manner. Understanding sources and types of cognitive bias can help minimize miscommunication and inform development of better strategies for responding to adversary attempts to leverage these biases to their advantage (see Figure 1 for an overview of bias during the communication process).

[Figure 1 is a flow diagram. Initial conditions (the world is complicated, and information is both vast and imperfect; we are often in a hurry, and human cognition is limited) lead to an adaptation: the use of heuristics and other shortcuts for survival. Cognitive bias in the communicator, whether unintentional or intentional, can then distort the message on encoding or on decoding, affecting information search, information framing, information weighting, and information interpretation. The result is ineffective communication and, in turn, inaccurate judgments, poor decisions, and misunderstandings (the "Why do we care?" box).]

Figure 1: Bias during the communication process—from initial conditions to why it matters

Human beings have two basic ways of processing information1 (Chaiken, 1987; Kahneman, 2011; Lieberman, 2007; Petty & Cacioppo, 1986; Stanovich, 1999). One way is through deliberative logical reasoning, which requires a significant amount of high-quality information, is costly in terms of cognitive processing, and thus is rarely used. The other way is more heuristic-based and also influenced by emotional and moral motives, as well as social influences (Hilbert, 2012).2 Heuristic-based processing3 is largely adaptive—enabling people to efficiently process large amounts of information and simplify their judgments (Gigerenzer, 2008; Gigerenzer et al., 2011; Gigerenzer & Todd, 1999; Haselton et al., 2009; Kahneman, 2011). However, the use of heuristic-based processing or mental shortcuts can also make people vulnerable to certain cognitive traps, or biases, when these shortcuts result in incorrect inferences about the world (Kahneman, 2011; Tversky & Kahneman, 1986). As illustrated in Figure 1 above, these cognitive biases can be invoked either intentionally or unintentionally, have a multitude of effects on information processing, and thus impact the overall effectiveness of communication. Specifically, when it creates distortion4 either during message encoding by the sender (i.e., translating ideas into communication) or during message decoding by the recipient (i.e., translating communication into ideas), cognitive bias can result in ineffective communication.

Moreover, when triggered during communication, cognitive biases may reinforce one another. For example, failure to recognize that multiple perspectives exist may cause us to make erroneous inferences about others—making it more likely that communication will go awry. Such ineffective communication may reinforce certain cognitive biases (e.g., bolstering negative beliefs about a communicator or group), which increases the likelihood of further distortion. Becoming more aware of our biases and how they affect communication is an important first step in interrupting this chain reaction. Toward that end, the remainder of this paper will focus on exploring the items in the colored boxes in Figure 1.

Types of Bias

A review of relevant academic and professional literatures can return lists of several dozen to several hundred types of cognitive bias (e.g., Dixon, 2019; Baron, 2007, as cited in Hilbert, 2012). As our emphasis here is on understanding what makes communication effective in the IE (as conceptualized in the Department of Defense's doctrine, Joint Concept for Operating in the Information Environment [JCOIE]), we focus on a subset of the most relevant biases. The objective is two-fold: to aid planners and decision makers in 1) recognizing, and thus being able to counter, the intentional exploitation of cognitive biases to persuade populations; and 2) knowing which biases limit the accuracy and robustness of communications meant to inform, as well as impact how messages are received and decoded.

1 A comprehensive discussion of information processing is beyond the scope of this report. However, as information processing
fundamentally depends on internal representations of information, message encoding and message decoding (discussed below)
can be thought of as subsets of information processing for the current purpose (Gazzaniga et al., 2002).
2 For simplicity, we will refer to this set of information processing orientations collectively as heuristic-based processing.
3 Heuristics are often studied and understood within the context of judgment and decision-making (Gilovich & Griffin, 2010). Here,
we explore their application to the communication context, focusing only on the downstream cognitive biases that have the
potential to distort messaging.
4 For additional sources of distortion (e.g., communicators’ world views), see the full communication model developed for this
project. Contact Dr. Belinda Bragg (bbragg@nsiteam.com) for more information.

Table 1: Categorization of Cognitive Biases Relevant to the Information Environment

Distortion of message upon ENCODING

Information Search: Availability bias; Confirmation bias; Negativity bias*; Optimism bias*; Salience bias

Information Presentation: Belief in a just world; Curse of knowledge; Endowment effect; False consensus effect; Fundamental attribution error*; Hot-cold empathy gap; Hyperbolic discounting; Illusion of control; Illusory correlation*; Naïve realism*; Status quo bias; Ultimate attribution error*

Distortion of message upon DECODING

Information Weighting: Authority bias; Bandwagon effect; Base rate fallacy; Focusing effect; Framing effect*; Hostile attribution bias*; Hot hand fallacy; Ingroup-outgroup bias; Mere exposure effect; Negativity bias*; Optimism bias*; Reactance; Reactive devaluation; Subjective validation

Information Interpretation: Anchoring & adjustment bias; Backfire effect; Belief bias; Belief perseverance; Framing effect*; Fundamental attribution error*; Halo effect; Hostile attribution bias*; Identifiable victim effect; Illusory correlation*; Illusory superiority; Naïve realism*; Negativity bias*; Pluralistic ignorance; Ultimate attribution error*; Zero sum bias

Note: * indicates a cognitive bias can emerge at multiple stages in the communication process, and is thus assigned to more than one category. In the original table, bolded items are those discussed in the body of this paper.

The resulting 38 types of cognitive bias are listed in Table 1, categorized according to how and when they affect information processing.5 For the sake of brevity, we discuss a subset of these biases in the following sections. Further information about these biases, as well as those not discussed, can be found in Appendix A.

Intentionality, Source, and the Likelihood of Cognitive Bias

Cognitive bias can occur either intentionally or unintentionally. This intentionality influences where in the communication process message distortion occurs—that is, upon encoding or decoding. One way to examine this issue is to explore the difference between communication whose goal is to persuade (which may in some cases intentionally aim to trigger cognitive biases) and communication whose goal is to inform (which should aim to avoid succumbing to cognitive biases but may nonetheless do so unintentionally).
5 While several of these biases arguably may be assigned to multiple categories, we endeavor to categorize them based on where
they are most likely to be invoked, but allow for multiple assignments where necessary.

Attempts to persuade or change the attitudes and/or behaviors of a recipient often employ messages that are designed to disrupt how recipients weigh or interpret information. This might be accomplished by focusing on only one side of a story rather than equally and objectively presenting both sides. This type of persuasive communication (e.g., propaganda, advertising, information operations,6 and “fake news”) puts forward a specific narrative to encourage the perception or outcome desired by the sender, rather than leaving the recipient to more freely weigh and interpret available information. In this situation, it is important to note that intention is related to the accuracy of message encoding. If someone crafts a message to intentionally trigger a cognitive bias and achieves that outcome, no encoding error has occurred. Instead, the sender has achieved his or her goal (in this case, successful persuasion) if cognitive bias is triggered in the recipients upon message decoding (see Figure 1 above).

In contrast, attempts to inform a recipient by providing information that is both accurate and reasonably complete (e.g., situation reports, objective studies) may become derailed unintentionally when the people constructing the messages fail to recognize that their own biases are distorting message encoding. Similarly, failed (versus successful) attempts at persuasion may also indicate that an unintentional encoding error has occurred. In both cases (i.e., attempts to inform and failed attempts at persuasion), cognitive bias in the sender can affect the search for and presentation of information, unintentionally distorting the encoding of the message that they send.

Certain features of the communicator and of the information environment may contribute to heuristic-based processing (see Background section above), making cognitive biases more likely to occur in either encoding or decoding (e.g., Eagly & Chaiken, 1993; Fiske & Taylor, 1991; Gilbert et al., 1988; Peer & Gamliel, 2012; Petty & Cacioppo, 1986). Table 2 presents some of the more common factors, indicating which aspect of information processing (i.e., information search, presentation, weighting, or interpretation) they can affect.

Table 2: Factors Increasing the Likelihood of Cognitive Biases and Their Effect on Information Processing

Factors that increase likelihood of bias:
• Time pressure
• Conflicting information
• Too much information
• Unknown unknowns
• Uncertainty
• Distracted
• Less invested in issue
• Strong emotion
• Limited cognitive resources
• Preconceptions
• Worldview
• Mental or physical fatigue
• Threat to self or group
• Low “need for cognition” (do not enjoy thinking)

Note: In the original table, shaded cells indicate which aspects of information processing (search, presentation, weighting, or interpretation) a given factor can distort.

6 We adopt the definition of “information operations” utilized in JP3-13: “The integrated employment, during military operations,
of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision
making of adversaries and potential adversaries while protecting our own.”

As the table shows, it can be difficult to avoid falling into cognitive bias traps given the prevalence of many of these environmental factors and the pervasiveness of their effects on information processing. However, developing a solid understanding of the potential biases and their effects provides a good starting point from which to counter them. We explore this topic in greater detail in the sections below.

Cognitive Biases That Influence Message Encoding by the Sender

Cognitive bias can distort how messages are encoded in one of two ways. It can affect the sender’s information search, resulting in restrictions on the search for and selection of information, or it can affect information presentation, distorting how information is construed and subsequently presented. The effects of individual cognitive biases will, moreover, be compounded if they occur together with others.

Distorting the Search for Information

The tendency for people to seek out information that comes readily to mind when making judgments about the frequency or probability of future events is known as the availability bias. This common bias can distort the search for information by constraining the range of inputs used, decreasing the effectiveness of a given message. For example, when asked about which issues are most important in the United States, people often respond by indicating those that have received recent media coverage (and thus are easily recalled), rather than thinking deeply about the broader range of potential issues and selecting from that set of possibilities. Availability bias may be compounded if it occurs in conjunction with salience bias—the tendency for people to focus on more prominent information to the exclusion of other potentially relevant information (e.g., thinking about the story of greatest relevance to them rather than the full set of topics presented in recent news coverage). Similarly, one can imagine how a situation report intended to deliver actionable information on local road closures could be distorted if the person constructing the report primarily relied on information that was easily accessible or most frequently reported. The unfortunate result would be that the decision-maker receiving the report would unknowingly be steered toward making a decision based on incomplete information (and thus, biased communication).

Other examples of biases that can affect information search in ways especially relevant to operations in the information environment (OIE) are:

• Confirmation bias: people search for, and focus on, information or evidence that supports a pre-existing belief, and give this evidence greater credence than information that would disconfirm their belief.
• Negativity bias: a person’s psychological state is more strongly impacted by negative rather than positive information. As such, he or she may more readily notice and recall negative events, outcomes, or feelings.

Distorting the Presentation of Information

The curse of knowledge is a cognitive bias that describes the tendency for better-informed people to find it difficult or impossible to think about a situation from the perspective of someone who is not privy to the same information or knowledge.

It is a bias that can significantly affect how the knowledgeable person frames a message (i.e., what the communicator chooses to emphasize). Lacking the ability to appreciate what others know, as well as their level of understanding of the topic at hand, can easily derail a message; this results from a failure to include information upon message encoding that is necessary for the recipient to unambiguously decode it as it was intended. An infamous example of military miscommunication, the Charge of the Light Brigade during the Crimean War, may be partially explained by the curse of knowledge (Pinker, 2014). The British commander in Crimea, Lord Raglan, undoubtedly thought that his order to “advance rapidly to the front—follow the enemy and try to prevent the enemy from carrying away the guns” was clear. The order was, however, ambiguous in light of the known situation on the ground. This left the order up to interpretation both by the recipient (Lord Lucan, the cavalry commander) and the intermediary (Captain Louis Nolan) who delivered the message—resulting in a disastrous frontal assault against the wrong artillery battery (BBC HistoryExtra, 2018).

Other examples of biases that can affect information presentation in ways especially relevant to OIE are:

• Optimism bias: people underestimate the probability of adverse or catastrophic outcomes, which can result in inadequate contingency planning and taking unnecessary risks.
• Naïve realism: people naively believe that they see the world objectively and without any bias. This leads them to believe that “rational people” will agree with their perception of the world, and that those who do not agree with them are irrational, uninformed, or biased.

Implications of Message Encoding Biases for Operators and Planners

In order to minimize the effects of cognitive biases on message encoding, it is critical for operators and planners to recognize these biases as well as the triggers that cause them (see Appendix A for additional information). This is true both for crafting communication that is intended to persuade and crafting communication that is intended to objectively inform. The potential for message distortion resulting from the cognitive bias of the sender can be minimized by training operators and planners to recognize and avoid biases; red teaming and peer review of intended communication (e.g., individual messages, reports) can similarly help to identify and mitigate bias. Potential messages can also be pre-tested with a sample audience and revised based on the feedback that is received.

It is also important to establish the knowledge level of message recipients before crafting a message. If this is not possible, the message sender should assume the recipient has minimal knowledge of the topic, and craft the message to be as descriptive and specific as possible. In crafting the message, the sender should also keep in mind whether he or she is trying to engage the recipient’s deliberative logical reasoning (harder task) or heuristic-based processing (easier task) (see Background section above).

Cognitive Biases That Influence Message Decoding by the Recipient

Even when a message is encoded and transmitted as intended, cognitive bias can affect how messages intended either to persuade or inform are decoded.

This creates one of two kinds of distortion. The first affects information weighting, influencing the relative emphasis that is placed on different aspects of incoming information. The second affects information interpretation, influencing how incoming information is understood. Once again, the effects of individual cognitive biases will be compounded if they occur together with others.

Distorting the Weighting of Information

The focusing effect,7 which causes the message recipient to place too much emphasis on one aspect of an event or issue, while neglecting other potentially important information, is a clear example of distortion of information weighting (Brickman et al., 1978; Gilovich et al., 2019; Kahneman et al., 2006; Wilson et al., 2000). The focusing effect may also be compounded if combined with other weighting biases, such as the bandwagon effect (the tendency to do or believe things because many other people do or believe the same). The focusing effect is particularly likely to result in miscommunication and inaccuracy in affective forecasting—or judgments of how we will feel in the future (Wilson et al., 2000). Policy initiatives, public health appeals, and cooperation requests provide ample opportunity for this bias. This suggests that it may be a concern in how messaging regarding US military engagement or proposed cooperation will be decoded. Recently, US adversaries seem to have purposefully exploited the focusing effect. For example, China has crafted a narrative positioning itself to potential partners as a “no strings attached” investor in their economies. Willing recipients of such investment have, however, discovered that it does in fact come with significant risks (Abi-Habib, 2018; Baboi, 2019; DeAeth, 2018; Dimitrov, 2019; Fernholz, 2018; Horowitz & Alderman, 2017). It is possible that decision-makers in those countries succumbed to the focusing effect, thinking about the perceived future benefits of such investment without fully taking into account the potential pitfalls (or even purposefully dismissing them, as a result of additional cognitive biases).

Other examples of biases that can affect information weighting in ways especially relevant to OIE are:

• Authority bias: people assume that the opinions of an authority (e.g., recognized figures, leaders, or experts) are more accurate, increasing the likelihood that a message will be accepted.
• The mere exposure effect: people come to like something more upon repeated exposure to it, resulting in a preference for familiar objects or people. In fact, one of the most commonly used metrics of effectiveness in advertising is target ratings points (TRPs), which measure the number of times a target audience is exposed to a message (see the illustrative arithmetic below).
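As a rough illustration of how such exposure is typically counted (standard advertising arithmetic, not a formula taken from this report), TRPs are commonly computed as reach within the target audience multiplied by the average frequency of exposure:

\[
\mathrm{TRPs} \;=\; \text{target reach (\%)} \times \text{average frequency}
\]

For example, a campaign that reaches 40% of its target audience an average of 3 times delivers 40 × 3 = 120 TRPs.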

7 This bias may be more likely to occur in Western versus Eastern cultures, as the West tends to place greater emphasis on context-
independent and analytic (vs. holistic) perceptual processes (Lam et al., 2005; Nisbett & Miyamoto, 2005).

Distorting the Interpretation of Information

The anchoring and adjustment bias occurs when the first information a person encounters provides an initial “anchor” that acts as a benchmark against which other information is evaluated. This bias affects the way in which recipients interpret incoming information in a wide range of daily situations—from salary negotiation and real estate sales to medical diagnoses and determining what constitutes a “good deal” (Epley & Gilovich, 2006; Tversky & Kahneman, 1974). For example, initial reports that conclude a situation is threatening may shape how subsequent intelligence is interpreted. If new information is discovered that suggests the situation is not threatening, an adjustment may be made to the assessment. However, that adjustment may fall short of where it would have been if the planner had started with the newest information. The anchoring and adjustment bias is particularly likely to emerge in situations where assessment requires frequent incorporation of new information. For example, receiving a one-time report that puts population support for continued US military presence at 30% is likely to be perceived as really low. However, if that same figure (30%) follows a prior report that placed support at 20%, then it is likely to be interpreted as “good,” unless the analyst can uncouple the new figure from the old.
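One stylized way to express this insufficient adjustment (our illustration; the report does not formalize the bias, and the adjustment weight \(\lambda\) is an assumed parameter) is to model the decoded assessment as a partial move from the anchor toward the new evidence:

\[
\hat{x} \;=\; a + \lambda\,(x_{\mathrm{new}} - a), \qquad 0 < \lambda < 1
\]

Here \(a\) is the anchor (e.g., the earlier 20% support figure), \(x_{\mathrm{new}}\) is the incoming figure (30%), and \(\lambda\) captures how completely the recipient adjusts. Because \(\lambda < 1\), the internalized assessment always lands between the anchor and the new information (with \(\lambda = 0.5\), it sits at 25%), stopping short of where it would have been had the analyst started from the newest report alone.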
Other examples of biases that can impact information interpretation in ways especially relevant to OIE are:

• Hostile attribution bias: information from certain actors is assumed to have hostile or nefarious intent, and is therefore not trusted. If the source is viewed in this way, any messages from that source will likely be viewed with a great deal of skepticism.
• Belief perseverance: an audience continues to hold on to previous beliefs and opinions even after they have been corrected. When faced with this bias, it may be very difficult to change attitudes through information operations (see footnote 6).

Distorting Both Information Weighting and Information Interpretation

Sometimes one bias can do two things at once, distorting both how recipients weigh incoming information and how they interpret information. This can be achieved intentionally by crafting messages to highlight certain pieces of information over others—for example, by emphasizing the risks rather than the benefits of a given choice (Nabi, 2003; Nelson et al., 1997; Tversky & Kahneman, 1981). Such framing has been shown to have a powerful influence on people’s decisions within multiple domains, including negotiation, public goods allocation, and voting (Kahneman & Tversky, 1984; Levin, 1987; Levin et al., 1985; Neale & Northcraft, 1986; Quattrone & Tversky, 1988). These framing effects can also be leveraged by those seeking to influence a population or government and shape perception of world affairs. For example, framing a conflict as a fight between good and evil creates perceptions of right and wrong, and can shape public support for specific policy actions against the “evil” side (Brewer, 2006).8

Sometimes two biases can work together to produce distortions in information weighting and in information interpretation. Consider the ingroup-outgroup bias, which influences people to think in terms of rigid “us versus them” categorization, treating ingroup members in a preferential way. In this context, messages that attempt to portray an outgroup’s rights may be viewed with some skepticism.

8 Framing effects themselves can dovetail with or invoke other biases. In this particular case, framing of the ingroup as the “good
side” would be likely to stimulate the biases discussed in the next paragraph.

This bias may co-occur with related biases such as the ultimate attribution error,9 which is the tendency for people to interpret the negative behavior of outgroup members as driven by character, and their positive behavior as due to external or circumstantial causes (Greenwald & Pettigrew, 2014; Pettigrew, 1979, 2001; Tajfel & Turner, 1986; Tajfel, 1982). The ultimate attribution error can be leveraged for nefarious purposes by justifying wide-scale action against, or denigration of, a particular part of a population. The classic example of this is the historic mistreatment and vilification of the Jewish people any time the majority or dominant group needed a scapegoat.

Implications of Message Decoding Biases for Operators and Planners

Understanding the biases that can occur during message decoding can help operators and planners reduce the likelihood of unintentionally triggering misperceptions on the part of the receiver that can lead to unintended consequences, such as dispute escalation. Understanding which biases are relevant and present, as well as understanding how they operate, can also be critical in “inoculating” ourselves against the attempts of others to communicate information in ways that exploit vulnerabilities in our own decoding. Knowing which biases are being leveraged also creates an opportunity to counter the message in kind. For example, if authority bias is being leveraged (e.g., through the use of an expert speaker), having another recognized expert provide an alternative viewpoint can help to neutralize the effect. Knowledge of the biases being invoked might similarly suggest a path to mitigation of adversary competitive or gray zone actions. Arguably, it was biases such as the ingroup-outgroup bias and ultimate attribution error that Russia invoked in its 2016 influence campaign on Facebook, which used hot-button issues to further divide American citizens (Frenkel & Benner, 2018; see also Wong et al., 2020 for a related discussion on China). For example, Russia’s Internet Research Agency created pages focused on social issues such as religion, policing, and especially race, then crafted ads to sow discord and division (Frenkel & Benner, 2018; Stewart, 2018). These messages played upon the common human tendency to think in terms of group membership and give preference to the ingroup, while providing ample fodder to encourage people to attribute any differences in opinion or behavior to the inherent “wickedness, immorality, or stupidity” of the outgroup.

A solid understanding of the cognitive biases discussed in the section above (and described in Appendix A to facilitate familiarization) will assist planners and operators in developing more effective strategies for persuasive communication by reducing distortions in message decoding. Leveraging heuristic-based processing and recognizing the cognitive characteristics of human nature are basic parts of campaigns designed to persuade. As long as information is presented in a truthful and ethical manner, understanding the effects of cognitive biases can help to make information transmission more impactful. Conversely, understanding the extent to which our own understanding can be manipulated can make us less vulnerable to adversary manipulation and disinformation campaigns.

9 Note that the ultimate attribution error can also be invoked during message encoding, which will affect how a sender frames his
or her message (e.g., engaging in the “blame and shame game” for perceived outsiders in the face of a negative event that—while
clearly caused by the outgroup—may have been inadvertent or atypical).

References
Abi-Habib, M. (2018, June 25). How China got Sri Lanka to cough up a port. The New York Times.
https://www.nytimes.com/2018/06/25/world/asia/china-sri-lanka-port.html
Baboi, I. (2019, May 27). Bridges, roads and debt traps? China’s “Balkan Silk Road.” Human Security Centre.
http://www.hscentre.org/europe/bridges-roads-debt-traps-chinas-balkan-silk-road/
BBC HistoryExtra. (2018). The Charge of the Light Brigade: Who blundered in the Valley of Death?
HistoryExtra. https://www.historyextra.com/period/victorian/the-charge-of-the-light-brigade-
who-blundered-in-the-valley-of-death/
Brewer, P. R. (2006). National interest frames and public opinion about world affairs. Harvard International
Journal of Press/Politics, 11(4), 89–102.
Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: Is happiness
relative? Journal of Personality and Social Psychology, 36(8), 917.
Chaiken, S. (1987). The heuristic model of persuasion. Social Influence: The Ontario Symposium, 5, 3–39.
https://books.google.com/books?hl=en&lr=&id=eGMAAwAAQBAJ&oi=fnd&pg=PA3&dq=The+heu
ristic+model+of+persuasion&ots=iwBkWABrs_&sig=OIe1S3JTveciwGF004NsLtZKeL0#v=onepage&
q=The%20heuristic%20model%20of%20persuasion&f=false
DeAeth, D. (2018, December 27). China’s African debt-trap: Beijing prepar... Taiwan News.
https://www.taiwannews.com.tw/en/news/3605624
Dimitrov, M. (2019, February 11). China’s Influence in Balkans Poses Risks, Report Warns. Balkan Insight.
https://balkaninsight.com/2019/02/11/chinas-influence-in-balkans-poses-risks-report-warns/
Dixon, P. (2019). Understand Your Brain: For a Change. OBI Press.
Eagly, A. H., & Chaiken, S. (1993). The psychology of attitudes. Harcourt Brace Jovanovich College Publishers.
Fernholz, T. (2018, March 7). Eight countries in danger of falling into China’s “debt trap.” Quartz.
https://qz.com/1223768/china-debt-trap-these-eight-countries-are-in-danger-of-debt-overloads-
from-chinas-belt-and-road-plans/
Fiske, S. T., & Taylor, S. E. (1991). Social cognition. McGraw-Hill Book Company.
Frenkel, S., & Benner, K. (2018, February 17). To stir discord in 2016, Russians turned most often to
Facebook. The New York Times. https://www.nytimes.com/2018/02/17/technology/indictment-
russian-tech-facebook.html
Gazzaniga, M. S., Ivry, R. B., & Mangun, G. R. (2002). Cognitive Neuroscience: The Biology of the Mind
(Second Edition). W.W. Norton & Company Inc.
Gigerenzer, G. (2008). Why heuristics work. Perspectives on Psychological Science, 3(1), 20–29.
Gigerenzer, G., Hertwig, R. E., & Pachur, T. E. (2011). Heuristics: The foundations of adaptive behavior.
Oxford University Press.
Gigerenzer, G., & Todd, P. M. (1999). Fast and frugal heuristics: The adaptive toolbox. In Simple heuristics
that make us smart (pp. 3–34). Oxford University Press.
Gilbert, D. T., Pelham, B. W., & Krull, D. S. (1988). On cognitive busyness: When person perceivers meet
persons perceived. Journal of Personality and Social Psychology, 54(5), 733.
Gilovich, T. D., & Griffin, D. W. (2010). Judgment and decision making. In D. T. Gilbert & S. T. Fiske (Eds.),
The Handbook of Social Psychology. McGraw-Hill, New York.
Gilovich, T., Keltner, D., Chen, S., & Nisbett, R. E. (2019). Social Psychology (5th ed.). W. W. Norton &
Company.
Greenwald, A. G., & Pettigrew, T. F. (2014). With malice toward none and charity for some: Ingroup
favoritism enables discrimination. American Psychologist, 69(7), 669.

Haselton, M. G., Bryant, G. A., Wilke, A., Frederick, D. A., Galperin, A., Frankenhuis, W. E., & Moore, T.
(2009). Adaptive rationality: An evolutionary perspective on cognitive bias. Social Cognition, 27(5),
733–763.
Hilbert, M. (2012). Toward a synthesis of cognitive biases: How noisy information processing can bias
human decision making. Psychological Bulletin, 138(2), 211.
Horowitz, J., & Alderman, L. (2017, August 26). Chastised by E.U., a resentful Greece embraces China’s cash and
interests. The New York Times. https://www.nytimes.com/2017/08/26/world/europe/greece-
china-piraeus-alexis-tsipras.html
Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
Kahneman, D., Krueger, A. B., Schkade, D., Schwarz, N., & Stone, A. A. (2006). Would you be happier if you
were richer? A focusing illusion. Science, 312(5782), 1908–1910.
Kahneman, D., & Tversky, A. (1984). Choices, values, and frames. American Psychologist, 39(4), 341.
Lam, K. C., Buehler, R., McFarland, C., Ross, M., & Cheung, I. (2005). Cultural differences in affective
forecasting: The role of focalism. Personality and Social Psychology Bulletin, 31(9), 1296–1309.
Levin, I. P. (1987). Associative effects of information framing. Bulletin of the Psychonomic Society, 25(2),
85–86.
Levin, I. P., Johnson, R. D., Russo, C. P., & Deldin, P. J. (1985). Framing effects in judgment tasks with varying
amounts of information. Organizational Behavior and Human Decision Processes, 36(3), 362–377.
Lieberman, M. D. (2007). The X-and C-systems: The neural basis of reflexive and reflective social cognition.
In E. Harmon-Jones & P. Winkelman (Eds.), Fundamentals of social neuroscience (pp. 290–315).
Guilford Press.
Nabi, R. L. (2003). Exploring the framing effects of emotion: Do discrete emotions differentially influence
information accessibility, information seeking, and policy preference? Communication Research,
30(2), 224–247.
Neale, M. A., & Northcraft, G. B. (1986). Experts, amateurs, and refrigerators: Comparing expert and
amateur negotiators in a novel task. Organizational Behavior and Human Decision Processes, 38(3),
305–317.
Nelson, T. E., Oxley, Z. M., & Clawson, R. A. (1997). Toward a psychology of framing effects. Political
Behavior, 19(3), 221–246.
Nisbett, R. E., & Miyamoto, Y. (2005). The influence of culture: Holistic versus analytic perception. Trends
in Cognitive Sciences, 9(10), 467–473.
Peer, E., & Gamliel, E. (2012). Estimating time savings: The use of the proportion and percentage heuristics
and the role of need for cognition. Acta Psychologica, 141(3), 352–359.
Pettigrew, T. F. (1979). The ultimate attribution error: Extending Allport’s cognitive analysis of prejudice.
Personality and Social Psychology Bulletin, 5(4), 461–476.
Pettigrew, T. F. (2001). The ultimate attribution error: Extending Allport’s cognitive analysis of prejudice. In
Intergroup relations: Essential readings (pp. 162–173). Psychology Press.
Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In Communication and
persuasion (Vol. 19, pp. 1–24). Advances in Experimental Social Psychology.
Quattrone, G. A., & Tversky, A. (1988). Contrasting rational and psychological analyses of political choice.
The American Political Science Review, 719–736.
Stanovich, K. E. (1999). Who is rational? Studies of individual differences in reasoning. Psychology Press.
Stewart, E. (2018, May 13). Most Russian Facebook ads sought to divide Americans on race. Vox.
https://www.vox.com/policy-and-politics/2018/5/13/17349670/facebook-russia-ads-race-house-
democrats
Tajfel, H. (1982). Social psychology of intergroup relations. Annual Review of Psychology, 33(1), 1–39.
Tajfel, H., & Turner, J. (1986). The social identity theory of inter-group behavior. In S. Worchel & L. W. Austin
(Eds.), Psychology of Intergroup Relations. Nelson-Hall.

Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science,
211(4481), 453–458.
Tversky, A., & Kahneman, D. (1986). Judgment under uncertainty: Heuristics and biases. Judgment and
Decision Making: An Interdisciplinary Reader, 38–55.
Wilson, T. D., Wheatley, T., Meyers, J. M., Gilbert, D. T., & Axsom, D. (2000). Focalism: A source of durability
bias in affective forecasting. Journal of Personality and Social Psychology, 78(5), 821–836.
Wong, E., Rosenberg, M., & Barnes, J. E. (2020, April 22). Chinese agents helped spread messages that
sowed virus panic in U.S., officials say. The New York Times.
https://www.nytimes.com/2020/04/22/us/politics/coronavirus-china-disinformation.html

Appendix A: Cognitive Bias Definitions and Sample References


Anchoring and adjustment heuristic: The tendency for people to rely on initial information “anchors” that influence subsequent judgments and interpretations.1 2

Authority bias: The tendency to assume that the opinions of an authority figure are more accurate (unrelated to their content), and subsequently be more influenced by these opinions.3

Availability heuristic: The tendency for people to seek out information that comes readily to mind when making judgments about the frequency or probability of future events.4

Backfire effect: The tendency for people to react to disconfirming evidence by strengthening their prior beliefs.5 6 7 8 9 10

Bandwagon effect: The tendency for people to do or believe things because many others do or believe the same (i.e., to “jump on the bandwagon”).11 12 13 14

Base rate fallacy (aka base rate neglect): The tendency for people to ignore relevant statistical information when making assessments about the frequency or likelihood of events (i.e., ignore base rate information).15 16 17 18

Belief bias (aka continued influence effect, Semmelweiss reflex): The tendency for people to accept or reject a conclusion based on how consistent it is with their everyday knowledge or how “believable” that conclusion is.19 20 21 22 23

Belief in a just world: The tendency for people to believe that the world is a fair and just place, where other people get what they deserve in life.24 25

Belief perseverance: The tendency for people to continue believing previously learned (mis)information even after their initial beliefs have been corrected, effectively rejecting any new or contradictory information.26 27

Confirmation bias: The tendency for people to search for and focus on information or evidence that supports a pre-existing belief or hypothesis, and give this evidence greater credence than information that would disconfirm this belief.28 29

Curse of knowledge (aka mindblindness): The tendency for better-informed people to find it difficult or impossible to think about a situation from the perspective of someone who is not privy to the same information or knowledge.30 31

Endowment effect: The tendency of people to value things that they own (i.e., things that become part of the person’s endowment) more positively than they would if they did not own them.32 33 34

False consensus effect: The tendency for people to overestimate how common their opinions are in the general population and therefore the degree to which others agree with them.35

Focusing effect (aka focalism): The tendency for people to place too much emphasis on one aspect of an event or issue, while neglecting other potentially important information.36 37 38

Framing effect: The tendency for our choices and judgments to be influenced by the way these choices are presented using different ordering, wording, or situations.39 40 41 42

Fundamental attribution error (aka correspondence bias): The tendency for people to overestimate the degree to which other people's behavior is caused by internal or dispositional factors, and to underestimate the degree to which situational or external factors play a role.43 44

Halo effect: The tendency for people to assume that attractive individuals have a range of other positive qualities beyond their physical appearance.45 46

Hostile attribution bias: The tendency for people to interpret others' behaviors as being caused by hostile intentions, even if the behaviors in question are benign or ambiguous.47

Hot-cold empathy gap: The tendency for people to underestimate the influence that their emotions have on their decisions and behaviors, while overestimating the role of cognition.48 49

Hot hand fallacy: The tendency for people to see statistically unrelated (i.e., random sequences of) events as being connected (such as a string of heads on multiple coin flips or making several baskets or goals in a row), in turn believing that the streak or “hot hand” will continue.50

Hyperbolic discounting: The tendency for people to frequently prioritize near-term benefits over future gains when making decisions.51 52 53

Identifiable victim effect (aka compassion fade): The tendency for people to be more moved by the vivid plight of a single individual than they are by the less imaginable situations of a greater number of people.54 55 56 57 58

Illusion of control (aka illusory control): The tendency for people to believe that they have control over random events or events over which they are in actuality powerless.59 60 61

Illusory correlation: The tendency for people to perceive a relationship or correlation where none actually exists, therefore assuming that two events or characteristics are related when they are not.62

Illusory superiority (aka Lake Wobegon effect, better-than-average effect, superiority bias): The tendency for most people to believe that they are above average on a wide variety of personality, trait, and ability dimensions.63 64

Ingroup-outgroup bias (aka ingroup favoritism, ingroup bias, intergroup bias): The tendency for people to think in terms of rigid “us versus them” categorization and treat ingroup members in a preferential way relative to outgroup members.65 66 67

Mere exposure effect (aka familiarity principle): The tendency for people, upon repeated exposures to something, to come to like it more, resulting in a preference for familiar objects or people.68 69

Naïve realism: The tendency for people to believe that they see the world in an objective and unbiased way (i.e., to see reality as it really is), that rational people will agree with this perception of the world, and that those who do not agree are either irrational, uninformed, or biased.70

Negativity bias: The tendency for people’s psychological states to be more greatly influenced by things of a negative nature than by things that are generally positive, even when the negative and positive things are equal in number or proportion.71 72 73

Optimism bias (aka unrealistic optimism, positive outcome bias): The tendency for people to overestimate the likelihood that they will have favorable future outcomes and to underestimate the likelihood that they will have unfavorable future outcomes.74 75 76 77 78

Pluralistic ignorance: The tendency for people to misperceive a group norm when they observe others acting at variance with their private beliefs out of a concern for the social consequences, which increases the likelihood that perceivers themselves will engage in the same behaviors, thereby reinforcing the erroneous group norm.79 80 81

Reactance: The tendency for people to—when they feel that their freedom to engage in a specific behavior is constrained—feel an unpleasant state of resistance, which they can reduce by engaging in the prohibited behavior.82

Reactive devaluation: The tendency for people to be more likely to devalue, and therefore reject, an idea or proposal if it comes from an opposing group or perceived outgroup than when it comes from an ingroup member or members.83 84 85

Salience bias (aka perceptual salience): The tendency for people to focus on more prominent information, to the exclusion of other potentially relevant information, creating a bias in favor of things that are easily perceptible and vivid.86 87 88 89 90

Status quo bias: The tendency for people to prefer that things stay relatively the same, resulting in a preference for the current or default choice relative to other alternatives.91 92 93

Subjective validation (aka personal validation effect, Barnum effect, Forer effect): The tendency for people to judge a statement or piece of information as being valid if it is personally meaningful to them.94 95 96

Ultimate attribution error: The tendency for people to interpret the negative behavior of outgroup members as being due to their character, and the positive behavior of outgroup members as being due to external or circumstantial causes.97

Zero sum bias: The tendency for people to erroneously perceive a situation as being zero-sum (i.e., one where one person or side can gain only at the expense of another).98

1
Epley, N., & Gilovich, T. (2006). The anchoring-and-adjustment heuristic: Why the adjustments are
insufficient. Psychological Science, 17(4), 311-318.
2
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185,
1124–1130.
3
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal Psychology, 67 (4), 371–378.
4
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability.
Cognitive Psychology, 5(2), 207–32.
5
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political
Behavior, 32, 303–330.
6
Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental
evaluation of the effects of corrective information. Vaccine, 33(3), 459–464.

7
Peter, C., & Koch, T. (2016). When debunking scientific myths fails (and when it does not): The backfire
effect in the context of journalistic coverage and immediate judgments as prevention strategy. Science
Communication, 38(1), 3-25.
8
Trevors, G. J., Muis, K. R., Pekrun, R., Sinatra, G. M., & Winne, P. H. (2016). Identity and epistemic emotions
during knowledge revision: A potential account for the backfire effect. Discourse Processes, 53(5-6), 339-
370.
9
Cf: Haglin, K. (2017). The limitations of the backfire effect. Research & Politics, 4(3), 1-5.
10
Cf: Wood, T., Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence.
Political Behavior, 41, 135–163.
11
Leibenstein, H. (1950). Bandwagon, snob, and veblen effects in the theory of consumers' demand. The
Quarterly Journal of Economics, 64(2), 183–207.
12
Myers, D. G., Wojcicki, S. B., & Aardema, B. S. (1977). Attitude comparison: Is there ever a bandwagon
effect? Journal of Applied Social Psychology, 7(4), 341-347.
13
Nadeau, R., Cloutier, E., & Guay, J. H. (1993). New evidence about the existence of a bandwagon effect
in the opinion formation process. International Political Science Review, 14(2), 203-213.
14
Zech, C. E. (1975). Leibenstein's bandwagon effect as applied to voting. Public Choice, 21, 117-122.
15
Allen, M., Preiss, R. W., Gayle, B. M. (2006). Meta-analytic examination of the base-rate fallacy.
Communication Research Reports, 23(1), 45-51.
16
Kahneman. D. & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80, 237-251.
17
Locksley, A., Hepburn, C., & Ortiz, V. (1982). Social stereotypes and judgments of individuals: An instance
of the base-rate fallacy. Journal of Experimental Social Psychology, 18(1), 23-42.
18
Cf: Koehler, J. J. (1996). The base rate fallacy reconsidered: Descriptive, normative, and methodological
challenges. Behavioral and Brain Sciences, 19(1), 1-17.
19
Cherubini, P., Garnham, A., Oakhill, J., & Morley, E. (1998). Can any ostrich fly? Some new data on belief
bias in syllogistic reasoning. Cognition, 69(2), 179-218.
20
Goel, V. & Dolan, R. J. (2003). Explaining modulation of reasoning by belief. Cognition, 87(1), B11-B22.
21
Klauer, K. C., Musch, J., Naumer, B. (2000). On belief bias in syllogistic reasoning. Psychological Review,
107(4), 852–84.
22
Markovits, H., & Nantel, G. (1989). The belief-bias effect in the production and evaluation of logical
conclusions. Memory and Cognition, 17(1), 11-17.
23
Roberts, M. J. & Sykes, E. D. (2003). Belief bias and relational reasoning. The Quarterly Journal of
Experimental Psychology Section A, 56(1), 131-154.
24
Lerner, M. J., & Miller, D. T. (1978). Just world research and the attribution process: Looking back and
ahead. Psychological Bulletin, 85(5), 1030–1051.
25
Lerner M.J. (1980) The Belief in a Just World. In: The Belief in a Just World. Perspectives in Social
Psychology. Springer, Boston, MA.
26
Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued influence effect: When misinformation
in memory affects later inferences. Journal of Experimental Psychology: Learning, Memory, and
Cognition, 20(6). 1420–1436.
27
Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception:
Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology,
32(5), 880–892.
28
Klayman, J. & Ha, Y. (1987). Confirmation, disconfirmation, and information in hypothesis testing.
Psychological Review, 94(20), 211-228.
29
Skov, R. B., & Sherman, S. J. (1986). Information gathering processes: Diagnosticity, hypothesis-
confirmatory strategies, and perceived hypothesis confirmation. Journal of Experimental Social
Psychology, 22, 93-121.

30
Birch, S. A. J. & Bloom, P. (2007). The curse of knowledge in reasoning about false beliefs. Psychological
Science, 18(5), 382-386.
31
Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An
experimental analysis. The Journal of Political Economy, 97 (5), 1232-1254.
32
Kahneman, D., Knetsch, J. L., Thaler, R.H. (1991). Anomalies: The endowment effect, loss aversion, and
status quo bias. The Journal of Economic Perspectives, 5(1), 193–206.
33
Kahneman, D., Knetsch, J. L., & Thaler, R. H. (1990). Experimental tests of the endowment effect and the
Coase Theorem. Journal of Political Economy, 98(6), 1325-1348.
34
Thaler, R. H. (1980). Toward a positive theory of consumer choice. Journal of Economic Behavior and
Organization, 1, 39-60.
35
Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus effect: An empirical and
theoretical review. Psychological Bulletin, 102(1): 72–90.
36
Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: Is happiness
relative? Journal of Personality and Social Psychology, 36(8), 917–927.
37
Kahneman, D., Krueger, A. B., Schkade, D., Schwarz, N., Stone, A. A. (2006). Would you be happier if you
were richer? A focusing illusion. Science, 312(5782), 1908–10.
38
Wilson, T., Wheatley, T., Meyers, J. M., Gilbert, D. T., & Axsom, D. (2000). Focalism: A source of durability
bias in affective forecasting. Journal of Personality and Social Psychology, 78(5), 821-836.
39
Druckman, J. (2001a). Evaluating framing effects. Journal of Economic Psychology, 22, 96–101.
40
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica,
47, 263-291.
41
Levin, I. P., Schneider, S. L., & Gaeth, G. J. (1998). All frames are not created equal: A typology and critical
analysis of framing effects. Organizational Behavior and Human Decision Processes, 76, 149-188.
42
Tversky, A. & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science,
211(4481), 453–58.
43
Ross, L. D., Amabile, T. M. & Steinmetz, J. L. (1977). Social roles, social control, and biases in social-
perception processes. Journal of Personality and Social Psychology, 35(7), 485–94.
44
Cf: Ji, L.-J., Peng, K., & Nisbett, R. E. (2000). Culture, control, and perception of relationships in the
environment. Journal of Personality and Social Psychology, 78(5), 943–955.
45
Nisbett, R. E. & Wilson, T. D. (1977). The halo effect: Evidence for unconscious alteration of judgments.
Journal of Personality and Social Psychology, 35(4), 250-256.
46
Thorndike, E. L. (1920). A constant error in psychological ratings. Journal of Applied Psychology, 4(1), 25-
29.
47
Anderson, K. B., Graham, L.M. (2007). Hostile attribution bias. Encyclopedia of Social Psychology. SAGE
Publications, Inc. pp. 446–447.
48
Loewenstein, G. (2005). Hot-cold empathy gaps and medical decision making. Health Psychology, 24(4),
S49-S56.
49
Sayette, M. A., Loewenstein, G., Griffin, K., & Black, J. J. (2008). Exploring the cold-to-hot empathy gap in
smokers. Psychological Science, 19(9), 926-932.
50
Gilovich, T., Tversky, A. & Vallone, R. (1985). The hot hand in basketball: On the misperception of random
sequences. Cognitive Psychology, 17, 295-314.
51
Ainslie, G., & Haslam, N. (1992). Hyperbolic discounting. In G. Loewenstein & J. Elster (Eds.), Choice over
time (pp. 57–92).
52
Laibson, D. (1997). Golden eggs and hyperbolic discounting. The Quarterly Journal of Economics, 112(2),
443–478.
53
Rubinstein, A. (2003). Economics and psychology? The case of hyperbolic discounting. International
Economic Review, 44(4), 1207-1216.

54
Collins, R. L., Taylor, S. E., Wood, J. V. & Thompson, S. C. (1988). The vividness effect: Elusive or illusory?
Journal of Experimental Social Psychology, 24, 1-18.
55
Jenni, K. E., & Loewenstein, G. (1997). Explaining the identifiable victim effect. Journal of Risk and
Uncertainty, 14, 235–257.
56
Shedler, J., & Manis., M. (1986). Can the availability heuristic explain vividness effects? Journal of
Personality and Social Psychology, 51(1), 26–36.
57
Small, D. A., Loewenstein, G., & Slovic, P. (2007). Sympathy and callousness: The impact of deliberative
thought on donations to identifiable and statistical victims. Organizational Behavior and Human Decision
Processes, 102, 143–153.
58
Västfjäll D., Slovic, P., Mayorga, M., & Peters, E. (2014). Compassion fade: Affect and charity are greatest
for a single child in need. PLOS ONE, 9 (6): e100115.
59
Langer, E. J. (1975). The illusion of control. Journal of Personality and Social Psychology, 32(2), 311–328.
60
McKenna, F. P. (1993). It won't happen to me: Unrealistic optimism or illusion of control? British Journal
of Psychology, 84(1), 39-50.
61
Presson, P. K., & Benassi, V. A. (1996). Illusion of control: A meta-analytic review. Journal of Social
Behavior and Personality, 11(3), 493.
62
Chapman, L. J. & Chapman, J. (1967). Genesis of popular but erroneous diagnostic observations. Journal
of Abnormal Psychology, 72, 193-204.
63
Alicke, M. D., & Govorun, O. (2005). The better than average effect. In M. D. Alicke, D. A. Dunning, & J.
Krueger (Eds.), The Self in Social Judgment. New York: Taylor & Francis Group.
64
Hoorens, V. (1993). Self-enhancement and Superiority Biases in Social Comparison. European Review of
Social Psychology, 4(1), 113–139.
65
Greenwald, A., & Pettigrew, T. (2014). With malice toward none and charity for some: Ingroup favoritism
enables discrimination. American Psychologist, 69(7),669–684.
66
Tajfel, H. (1982). Social psychology of intergroup relations. Annual Review of Psychology, 33, 1-39.
67
Tajfel, H., & Turner, J. C. (1986). The social identity theory of intergroup behavior. In S. Worchel & W. G.
Austin (Eds.), Psychology of Intergroup Relations. Chicago: Nelson-Hall.
68
Bornstein, R. F. (1989). Exposure and affect: Overview and meta-analysis of research, 1968–1987.
Psychological Bulletin, 106(2), 265-289.
69
Zajonc, R. B. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology,
9(2), 1-27.
70
Ross, L. & Ward, A. (1997). Naive realism in everyday life: Implications for social conflict and
misunderstanding. In A. Ward, L. Ross, E Reed, E Turiel (Eds.), Values and Knowledge.
71
Ito, T. A., Larsen, J. T., Smith, N. K., & Cacioppo, J. T. (1998). Negative information weighs more heavily
on the brain: The negativity bias in evaluative categorizations. Journal of Personality and Social
Psychology, 75(4), 887–900.
72
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica,
47, 263-291.
73
Tversky, A. & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of
uncertainty. Journal of Risk and Uncertainty, 5, 297-323
74
Bracha, A., & Brown, D. J. (2012). Affective decision making: A theory of optimism bias. Games and
Economic Behavior,75(1), 67-80
75
McKenna, F. P. (1993). It won't happen to me: Unrealistic optimism or illusion of control? British Journal
of Psychology, 84(1), 39-50.
76
Sharot, T. (2011). The optimism bias. Current Biology, 21(3).
77
Slovic, P. (Ed.). (2000). Risk, society, and policy series. The perception of risk. Earthscan Publications.
78
Weinstein, N. D. (1980). Unrealistic optimism about future life events. Journal of Personality and Social
Psychology, 39(5), 806–820.

79
Miller, D. T., & McFarland, C. (1987). Pluralistic ignorance: When similarity is interpreted as dissimilarity.
Journal of Personality and Social Psychology, 53(2), 298–305.
80
Prentice, D. A. & Miller, D. T. (1996). Pluralistic ignorance and the perpetuation of social norms by
unwitting actors. Advances in Experimental Social Psychology, 28, 161-209.
81
Shelton, J. N., & Richeson, J. A. (2005). Intergroup contact and pluralistic ignorance. Journal of Personality and Social Psychology, 88(1), 91-107.
82
Brehm, S. S. & Brehm, J. (1981). Psychological reactance: A theory of freedom and control. New York:
Academic Press Inc.
83
Bruneau, E. (2015) Putting neuroscience to work for peace. In: E. Halperin and K. Sharvit (Eds.) The Social
Psychology of Intractable Conflicts. Peace Psychology Book Series (Vol 27). Cambridge, MA: Springer.
84
Ross, L. (1995). Reactive devaluation in negotiation and conflict resolution. In K. J. Arrow (Ed.), Barriers
to conflict resolution (1st ed.). New York: W.W. Norton.
85
Ross, L. & Stillinger, C. (1991), Barriers to conflict resolution. Negotiation Journal, 7, 389–404.
86
Bordalo, P., Gennaioli, N., & Shleifer, A. (2012). Salience theory of choice under risk. The Quarterly Journal
of Economics 127(3) 1243–1285.
87
Defetyer, M. A., Russo, R., & McPartlin, P. L. (2009). The picture superiority effect in recognition memory:
a developmental study using the response signal procedure. Cognitive Development, 24 (3): 265–273.
88
Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases.
Cambridge, UK: Cambridge University Press.
89
Whitehouse, A. J., Maybery, M. T., Durkin, K. (2006). The development of the picture-superiority effect.
British Journal of Developmental Psychology, 24(4): 767–773.
90
Cf: Taylor, S. E., & Thompson, S. C. (1982). Stalking the elusive "vividness" effect. Psychological Review,
89(2), 155–181.
91
Samuelson, W. & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and
Uncertainty, 1, 7–59.
92
Kahneman, D, Knetsch, J. L., Thaler, R. H. (1991). Anomalies: the endowment effect, loss aversion, and
status quo bias. The Journal of Economic Perspectives, 5(1): 193–206.
93
Masatlioglu, Y., & Efe, A. O. (2005). Rational choice with status quo bias. Journal of Economic Theory,
121(1), 1-29.
94
Dickson, D. H. & Kelly, I. W. (1985). The ‘Barnum Effect’ in personality assessment: a review of the
literature. Psychological Reports, 57(2), 367-382.
95
Forer, B. R. (1949). The fallacy of personal validation: a classroom demonstration of gullibility. The Journal
of Abnormal and Social Psychology, 44(1), 118–123.
96
Glick, P., Gottesman, D., & Jolton, J. (1989). The fault is not in the stars: susceptibility of skeptics and
believers in astrology to the Barnum Effect. Personality and Social Psychology Bulletin, 15(4), 572-583.
97
Pettigrew, T. F. (1979). The ultimate attribution error: extending Allport's cognitive analysis of prejudice.
Personality and Social Psychology Bulletin, 5(4), 461–476.
98
Meegan, D. V. (2010). Zero-sum bias: perceived competition despite unlimited resources. Frontiers in
Psychology, 1(191), 1-7.
