SpringerBriefs in Complexity
Susannah B. F. Paletz
Brooke E. Auxier
Ewa M. Golonka
A Multidisciplinary Framework of Information Propagation Online
SpringerBriefs in Complexity
Series Editors:
Henry D. I. Abarbanel, University of California, Institute for Nonlinear Science,
La Jolla, CA, USA
Dan Braha, New England Complex Systems Institute,
University of Massachusetts, North Dartmouth, MA, USA
Péter Érdi, Department of Physics, Center for Complex Systems Studies,
Kalamazoo College, Kalamazoo, MI, USA
Karl J. Friston, University College London, Institute of Cognitive Neuroscience,
London, UK
Hermann Haken, University of Stuttgart, Center of Synergetics,
Stuttgart, Germany
Viktor Jirsa, Université de la Méditerranée, Centre National de la Recherche
Scientifique (CNRS), Marseille, France
Janusz Kacprzyk, Systems Research Institute, Polish Academy of Sciences,
Warsaw, Poland
Kunihiko Kaneko, Research Center for Complex Systems Biology,
The University of Tokyo, Tokyo, Japan
Scott Kelso, Florida Atlantic University, Center for Complex Systems and Brain
Sciences, Boca Raton, FL, USA
Markus Kirkilionis, Mathematics Institute and Centre for Complex Systems,
University of Warwick, Coventry, UK
Jürgen Kurths, University of Potsdam, Nonlinear Dynamics Group,
Potsdam, Brandenburg, Germany
Ronaldo Menezes, Department of Computer Science, University of Exeter,
Exeter, UK
Andrzej Nowak, Department of Psychology, Warsaw University,
Warszawa, Poland
Hassan Qudrat-Ullah, King Fahd University of Petroleum and Minerals,
Dhahran, Saudi Arabia
Peter Schuster, University of Vienna, Vienna, Austria
Frank Schweitzer, ETH Zurich, System Design, Zürich, Switzerland
Didier Sornette, ETH Zurich, Entrepreneurial Risk, Zürich, Switzerland
Stefan Thurner, Section for Science of Complex Systems,
Medical University of Vienna, Vienna, Austria
Linda Reichl, University of Texas, Center for Complex Quantum Systems,
Austin, TX, USA
SpringerBriefs in Complexity are a series of slim high-quality publications
encompassing the entire spectrum of complex systems science and technology.
Featuring compact volumes of 50 to 125 pages (approximately 20,000–45,000
words), Briefs are shorter than a conventional book but longer than a journal article.
Thus Briefs serve as timely, concise tools for students, researchers, and professionals.
Typical texts for publication might include:
• A snapshot review of the current state of a hot or emerging field
• A concise introduction to core concepts that students must understand in order to
make independent contributions
• An extended research report giving more details and discussion than is possible in a conventional journal article
• A manual describing underlying principles and best practices for an experimental or computational technique
• An essay exploring new ideas on broader topics such as science and society
Briefs allow authors to present their ideas and readers to absorb them with
minimal time investment. Briefs are published as part of Springer’s eBook collection,
with millions of users worldwide. In addition, Briefs are available, just like books,
for individual print and electronic purchase. Briefs are characterized by fast, global
electronic dissemination, straightforward publishing agreements, easy-to-use
manuscript preparation and formatting guidelines, and expedited production
schedules. We aim for publication 8–12 weeks after acceptance.
SpringerBriefs in Complexity are an integral part of the Springer Complexity
publishing program. Proposals should be sent to the responsible Springer editors or
to a member of the Springer Complexity editorial and program advisory board
(springer.com/complexity).
A Multidisciplinary Framework of Information Propagation Online

Susannah B. F. Paletz
Center for Advanced Study of Language, University of Maryland, College Park, MD, USA

Brooke E. Auxier
Philip Merrill College of Journalism, University of Maryland, College Park, MD, USA

Ewa M. Golonka
Center for Advanced Study of Language, University of Maryland, College Park, MD, USA
This Springer imprint is published by the registered company Springer Nature Switzerland AG.
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
To David Leon Paletz, who has conducted
multidisciplinary research in this area for
decades and who inspires us all to cross
boundaries with rigor and insight.
Acknowledgments
The authors are grateful to Erica Michael, Michael Maxwell, and Nikki Adams for
comments on a prior draft, to the CASL Researcher’s Forum attendees for helpful
comments during our presentation, to our families for their support and patience,
and to Rebecca Goolsby for her guidance on and funding of this project.
Funding/Support This material is based upon work supported, in whole or in part,
with funding from the United States Government Office of Naval Research grant
12398640. Any opinions, findings, and conclusions or recommendations expressed
in this material are those of the author(s) and do not necessarily reflect the views of
the University of Maryland, College Park, and/or any agency or entity of the United
States Government.
Contents

1 Introduction
  1.1 Background
  1.2 Overview of the Framework
2 Sources of Messages
  2.1 Originating Content
  2.2 Receiving Content
  2.3 Gathering Content
  2.4 Summary of Sources of Messages
3 Reactions to the Message and Messenger
  3.1 Affect and Engagement
    3.1.1 High Arousal-Specific Emotions
    3.1.2 Other Affective Engagement
  3.2 Cognition
    3.2.1 Belief in the Original Message
    3.2.2 Other Cognitive Reactions Rendering Belief Unnecessary
  3.3 Both Affect and Cognition
  3.4 Summary of Reactions by Genuine Users
4 Motivation to Share
  4.1 Needs
    4.1.1 Impression Management and Self-Enhancement
    4.1.2 Self-Consistency Motives and Social Identity
    4.1.3 Accuracy
    4.1.4 Affiliation
  4.2 Sociopolitical and Economic Motivations
  4.3 Summary of Motivations
Index
Chapter 1
Introduction
Abstract As the use of social media platforms has increased, they have become a new domain in information warfare. Before tackling the roots or spread of misinformation or disinformation, it is important to understand why people share any information on social media at all. This book presents a broad, multidisciplinary review of, and a theoretical framework for, the factors that have been shown to, or might, influence sharing information on social media, regardless of its veracity. The act of sharing information online is made up of several categories of factors: sources of messages, reactions to the original message and messenger, the motivation to share, the ability to share (and perception of the ability to share), and then, of course, actual sharing behavior. In addition, while genuine actors may have reactions to the original message and messenger, there also exist non-genuine actors that have pre-programmed or pre-planned reactions. We also qualitatively examined 20 fake news stories in two different languages as they appeared in social media in order to illustrate factors affecting information propagation and identify potential gaps in the literature.
Social media are a relatively new channel by which people not only acquire, but also share, information. Social media networks have also been used to spread misinformation, from Twitter attacks on the Democratic candidate Martha Coakley in at least 2010 to the 2016 U.S. presidential election (Mustafaraj & Metaxas, 2017). The revelations of Russian disinformation campaigns on social media against the U.S. population during the 2016 election have identified a stark vulnerability in the security of the United States (e.g., Sydell, 2017; United States of America v. Internet Research Agency LLC, 2018; Waltzman, 2017; Woolley & Howard, 2017). Fake news, as the term is currently popularized,1 is not new, nor is propaganda (e.g., Allcott & Gentzkow, 2017; Jowett & O'Donnell, 2015; Lazer et al., 2018; McKernon, 1928; Pratkanis & Aronson, 2001). However, this kind of 'information war' by Russia against the United States and European countries has increased since at least 2014 (e.g., Paul & Matthews, 2016; Prier, 2017; Woolley & Howard, 2017). Information warfare does not occur only between nations: Both jihadi Islamic groups and right-wing extremists also recruit and spread propaganda online (e.g., Benigni, Joseph, & Carley, 2017; Bowman-Grieve, 2013; Caiani & Wagemann, 2009; Derrick, Sporer, Church, & Ligon, 2016; Prier, 2017; Vidino & Hughes, 2015).
Social media-based information conflict has sparked the interest of government, academia, and industry alike. Key research topics underlying the flood of online communication are why and how people share narratives and information online. Narratives, in this context, refer to coherent stories that are shared with multiple people rather than isolated pieces of information (Green & Brock, 2005; Hinck, Kluver, & Cooley, 2017). A narrative might describe an activity or conflict consisting of a storyline with a beginning, middle, and end, rather than a single fact, and it may imply or state a context, how, and why (van Krieken & Sanders, 2016). Readers of news stories can be more involved in narrative reporting, for instance, compared to more neutral, 'hard news' reporting (van Krieken & Sanders, 2016). This review examines studies of information propagation, and often focuses on narratives. Given its conceptual breadth, we also use the term "message" to include narratives. Although narrative propagation is not new, social media have made spreading stories, including false ones or ones with false elements, easier.
Before tackling the roots or spread of misinformation (incomplete, vague, ambiguous information) or disinformation (intentionally untrue information; Cooke, 2017), it is important to understand why people share any information on social media at all. The propagation of information has many antecedents, causes, and moderating factors, including amplifiers and suppressors. Information propagation has been studied for decades across a range of disciplines: psychology, marketing, sociology and social network analysis, political science and political communication, human-computer interaction (HCI), journalism, and information sciences. This book leverages existing knowledge about the spread of information in general, why people are convinced by information they see, the effects of different kinds of messages on human affect and cognition, and what might make someone go from interest to sharing, as well as some possible cross-cultural differences. Most individual articles focus on a small number of factors that might be useful, with few works attempting a comprehensive view (see Hermida, 2014, for one such attempt). In addition to surveying the literature, we conducted a bottom-up, qualitative analysis of 20 fake news stories shared via social media in English and Russian, in terms of both language and social media users and platforms (Methodology in Appendix A). We use insights from these stories to contextualize and illustrate the findings from the literature, as well as to add to our theoretical framework. By assembling many factors from across a wide literature, future research can develop measures to quantify those factors, with the goal of discovering which ones have predictive power for encouraging sharing behavior in conjunction with each other in real-world social media and within different cultural and language contexts.

1 Fake news historically has included yellow journalism and other information in news media that is deliberately inaccurate or misleading. For most of this book, we use the term 'fake news' to refer to "news articles that are intentionally and verifiably false, and could mislead readers" (Allcott & Gentzkow, 2017, p. 213). Note that intent is hard to prove, and propaganda can include a mix of falsehood and truth. Although we will refer to 'fake news,' particularly in reference to the corpus we analyze, we will often also refer to the more technical terms of misinformation and disinformation (Cooke, 2017). Fake news may include one or both types. While disinformation is intentionally untrue, misinformation may have elements of truth in it (Cooke, 2017).
Thus, this work presents a broad, multidisciplinary review of the factors that have been shown to, or might, influence sharing information on social media, regardless of its veracity. This book begins with an introduction to the problem, then covers the background and an overview of a high-level framework of information sharing. The framework flows through the different ways information is acquired or viewed from a source, to reactions by the target sharer and motivations to share, to the ability (and perceptions of the ability) to share, before leading to sharing behavior. Finally, this framework distinguishes between genuine and non-genuine (inauthentic) actors: Non-genuine actors are individuals who are pretending to be someone they are not, and can include bots, which are automated and driven by algorithms designed to interact or share information.
1.1 Background
In the U.S., social media use has become synonymous with digital and mobile life.
A Pew Research Center study found that in November 2018, at least 69% of U.S.
adults used at least one social media site (Pew Research Center, Internet, &
Technology, 2018). This finding is a significant jump from November 2010, when
only 45% of American adults said the same. Among popular social media sites in
the U.S., including Facebook, Twitter, Instagram, LinkedIn, and Pinterest, Facebook
remains the most popular, with 68% of U.S. adults using it and 74% of them visiting
the site daily (Pew Research Center, Internet, & Technology, 2018).
Social media platforms are proving popular destinations for news consumption,
specifically. In August 2017, 67% of Americans reported getting at least some of
their news on social media, with, again, Facebook leading the way (Shearer &
Gottfried, 2017). Forty-five percent of Facebook users say they get news on the site,
followed by YouTube (18%) and Twitter (11%; Shearer & Gottfried, 2017), though
about a quarter of Americans report getting news from two or more social media
sites.
Many users openly struggle with misinformation and disinformation. A Pew
study using 2016 survey data found that 23% of respondents said they had shared a
made-up news story—some knowingly, some unknowingly (N = 1002; Barthel,
Mitchell, & Holcomb, 2016). A larger percentage of respondents in this study, 64%,
stated that fabricated news stories “cause a great deal of confusion about the basic
facts of current issues and events.” A different study of 1200 respondents found that
only 14% of American adults viewed social media as the most important source of
election news, but that all the respondents were exposed to at least one, or perhaps
several, fake news articles (Allcott & Gentzkow, 2017). Fake news websites relied far more heavily on social media traffic than did true news sites (42% versus 10%). Of concern, in a study of disconfirmed
rumors on Twitter, most misinformed users (86–91%) spread the false rumors rather
than expressing doubt about them or seeking confirmation of the rumor, and most
(78–96%) of the debunked-rumor-spreading users neither deleted nor clarified their
original posts (Wang & Zhuang, 2018).
Social media have made the cost of spreading information low, making it easy to influence public opinion (Allcott & Gentzkow, 2017; Shao, Ciampaglia, Varol, Flammini, & Menczer, 2017; Shifman, 2014). Opinions are expressed attitudes (Glynn, Herbst, O'Keefe, Shapiro, & Linderman, 2004), whereas attitudes are a mix of behavior, affect (feelings), and cognition (beliefs; Breckler, 1984; Glynn et al., 2004). Attitudes may be overtly or implicitly contained within narratives, including in how narratives are presented and framed. Social media are, of course, not only used for sharing individual political opinions. Social media are used by large companies to advertise; by smaller-scale creators and businesses to promote themselves and cultivate an audience and fan base; by politicians to share their thoughts directly with constituents and others; and by individuals to share news, memes, and their lives with each other. On the one hand, social media serve as a new 'public sphere' that enables minority and oppressed voices to gain an audience by circumventing gatekeepers, thus enhancing free speech (Debatin, 2008; Shifman, 2014). On the other hand, social media platforms are designed to encourage sharing, regardless of the authenticity or benevolence of the content, and thus support the best and worst of human psychology: the needs for attention, affiliation, and status; the desire to control and dominate narratives; and the attraction of novelty (e.g., Hermida, 2014; Tufekci, 2018). Malicious actors, be they state-sponsored or unaffiliated with a state, also use social media platforms to organize, spread their narratives, recruit, disrupt, undermine, and outright harm (see Goolsby, 2013; Goolsby, Galeano, & Agarwal, 2014; O'Sullivan, 2018; Paul & Matthews, 2016; Prier, 2017; Sindelar, 2014; Tufekci, 2018; Vidino & Hughes, 2015; Waltzman, 2017; Woolley & Howard, 2017).
Social media therefore can suffer from a series of problems, including deliberately false information, too much information, and malicious or hostile actions (Goolsby, 2013; Tufekci, 2018). This hostility can affect individuals outside of the digital environment, such as when individuals' personal information is deliberately posted online (called doxing) and when malicious actors falsely report an emergency to law enforcement, resulting in the deployment of a SWAT (Special Weapons and Tactics) team to an unsuspecting person's home (swatting; Tufekci, 2018). Even beyond deliberate malicious actions, compared to traditional media, new media encourage audiences to follow content they already agree with (Paletz, Owen, & Cook, 2018). Citizen journalists (and those trying to be journalists) lack the resources to truly uncover the truth and are primarily reactive (Paletz et al., 2018). In addition, most new media platforms are controlled by a few large corporations (e.g., Facebook), with smaller social media organizations increasingly acquired by larger ones (Arsenault & Castells, 2008; Paletz et al., 2018). Further, social media raise a range of privacy concerns (Ellison, Vitak, Steinfield, Gray, & Lampe, 2011; Paletz et al., 2018; Trepte & Reinecke, 2011). Not only are people on social media publicly sharing what used to be private word-of-mouth and rumor (Hermida, 2014), but social media are used to express and promote hatred, extremism, and fanaticism, and "are rife with muddle and nonsense, distortion and error" (Paletz et al., 2018, p. 27). With such downsides, one wonders why anyone remains engaged in social media, let alone shares information. Understanding why people share information online is key to sorting through the muddle (Paletz et al., 2018).
The virality, or widespread sharing, of messages themselves has been studied extensively. One conceptual model of sharing in social media, based on successful viral marketing campaigns, suggests four success factors with the acronym 'SPIN': "spreadability of content based on personal factors, the propagativity of content based on media type, the integration of multiple media platforms, and the successive reinforcement of messaging" (Mills, 2012, p. 166). Within this SPIN framework, spreadability refers to the likeability of the content of the message and whether the sharer feels others in the social network will have a similar reaction; propagativity refers to the ease with which consumers can continue to distribute or redistribute the content based on both qualities of the content itself and the initial sharer's social network; integration refers to the strategic use of multiple social networks simultaneously; and, although most content does not achieve this final stage, nexus refers to the "successive reinforcement of the campaign by virtue of sequentially releasing units of viral content" (Mills, 2012, p. 168). Our framework incorporates the factors behind both spreadability and propagativity and goes beyond Mills' model to include insights from a range of disciplines. Related research utilizes network analysis to examine the relationship between different kinds of memes based on their content, form, and stance (Shifman, 2013, 2014), rather than the social network of individuals sharing that information (e.g., Segev, Nissenbaum, Stolero, & Shifman, 2015). This research suggests that when a meme spreads and changes, it retains some of its essential and unique 'hooks' and generic attributes (Segev et al., 2015).
The higher-level constructs in this framework draw from a range of disciplinary theories. For example, from political communication, we take the different ways in which media collect narratives (Paletz, 2002). From psychology, we draw on the interplay and distinctions among cognition, affect, and behavior; the impact of culture on all three; the distinctions among different types of cognitive processing; persuasion; and the importance of both individual differences and the social situation (see below; e.g., Ajzen, 1991; Breckler, 1984; Brock & Green, 2005; Cialdini & Goldstein, 2004). This framework does not presume sharing, but instead identifies factors that may encourage or discourage sharing and the potential interactions among those factors.
Most of the components of this framework relate to the process of evaluation that a person goes through, explicitly or implicitly, before making a decision to share information. At the most abstract level, this evaluation involves information that originates from somewhere, a psychological reaction on the part of a real person engaged with that information (or a pre-programmed reaction by an entity that is not genuine), a motivation to share, and then an assessment as to what kind of sharing is possible and/or desirable. Thus, in our framework, the act of sharing information online is made up of several categories of factors: sources of messages, reactions to the original message and messenger, the motivation to share, the ability to share (and perception of the ability to share), and then, of course, actual sharing behavior (Fig. 1.1). Dotted lines indicate the general temporal flow of a process which may, but need not, indicate direct causal relationships. Solid lines indicate potential directional influence. For example, the sources of messages do not cause the same reaction in all people, but the messages elicit some kind of psychological reaction by genuine actors. That reaction is influenced by their dynamic motivations. This high-level framework has much in common with fundamental psychological theory that suggests that attitudes and norms feed into behavioral intentions that then feed into behavior, dependent on the perceived ability to perform the behavior (e.g., Madden, Ellen, & Ajzen, 1992). In addition, we recognize that an increasing amount of online activity is conducted by non-genuine actors (e.g., Arnsdorf, 2017; O'Sullivan, 2018; Woolley & Howard, 2017). These include bots, or algorithms designed to share (Lokot & Diakopoulos, 2016; Shao et al., 2017; Varol, Ferrara, Davis, Menczer, & Flammini, 2017), and/or sockpuppets, or multiple fake identities through which individuals create the illusion of support (or disdain) by pretending to be different people (Bu, Xia, & Wang, 2013). Sockpuppets can be bots or can be humans who are hiding behind a false identity. As defined here, non-genuine actors on social media do not react to original messages with authentic or real psychological responses (i.e., with affective and/or cognitive processes). Instead, they are programmed to propagate messages based on preexisting and adaptive plans. That noted, genuine human actors are behind the creation or goals of bots and sockpuppets, and they have specific motivations of their own (e.g., economic, political), such that genuine actors' motivations impact non-genuine actors. Finally, the broader context and match between that context and the component pieces of this framework, including culture and the sharer's placement within their social networks, influence and touch all the other components. The highest-level framework, represented in Fig. 1.1, includes all of these components (see also Fig. B1 in Appendix B for a detailed version of the framework).

[Fig. 1.1 The high-level framework: Sources of Messages lead to Reactions to the Message and Messenger and Motivation to Share (for genuine actors), or to pre-programmed responses (for non-genuine actors), and then to Ability to Share and Sharing Behavior; context, and the match between context and the other model components, influence all components]
Chapter 2
Sources of Messages
Abstract Messages originate from a variety of sources. Social media users may
create content, observe a message or narrative, or seek one out. What people view is
influenced by what they search for and what is being shared already in their social
networks.
commonly, social media users see information and narratives that those they follow
share, and then may decide to (or not to) share in turn. Finally, social media users
may actively seek out others’ sources, such as online news sources or blogs (web-
based public journal logs), and then gather them to share on Facebook, Twitter, or
some other platform.
Social media users can also receive content, viewing it on their social media passively, actively, and/or via the social media platform's algorithms. In a study of tweets during Typhoon Haiyan, the most common type of tweet (43.4%) was secondhand reporting of the disaster, or tweeting information that arose from somewhere else (Takahashi et al., 2015). As noted previously, two-thirds of Americans reported getting their news via social media (Shearer & Gottfried, 2017). Social media users' digital networks have a significant impact on what content they are exposed to and what they consume. As users follow specific accounts of friends, family members, celebrities, businesses, and media organizations on social media, what those groups and individuals share dictates the content appearing in their feeds. Homophily is a basic principle such that individuals who are similar tend to connect with each other (McPherson, Smith-Lovin, & Cook, 2001). Individuals who are similar to each other socioculturally, demographically, professionally, ideologically, and so on will tend to join together into networks.
For some, this clustering may create an echo chamber of information. In echo chambers, individuals are largely exposed to conforming opinions (Flaxman, Goel, & Rao, 2016). A study of 10.1 million Facebook users found strong liberal versus conservative homophily, particularly for liberals (Bakshy, Messing, & Adamic, 2015). In fact, individuals chose to view less cross-cutting content than what was available to them. However, the same study also found cross-ideological friendships: Of those who reported their ideology, over 20% of an individual's Facebook friends were from an opposing ideological affiliation. This large study suggests that most people's echo chambers are porous. That noted, in a self-report study of 103 participants, individuals engaged less on Facebook if they perceived more diversity in their networks compared to those who perceived more similarities (Grevet, Terveen, & Gilbert, 2014). In other words, sharing was more likely to occur in the context of homophily, when friends exposed the user to like-minded content and the user could be more assured of a positive audience.
Researchers examining homophily need to take into account the dynamic
(changeable) nature of friendship online (Noel & Nyhan, 2011). Two processes can
make social networks more homogeneous: Users can become more like those in the
existing network, and users can unfriend (drop from their social media network)
individuals who are dissimilar (Noel & Nyhan, 2011). These dynamic processes
increase homophily over time, making cross-sectional studies of the effects of
homophily on social influence potentially biased (Noel & Nyhan, 2011). Even with
that caveat, homophily is an established phenomenon that impacts the information
that social media users observe (Bakshy et al., 2015; Flaxman et al., 2016).
In addition, Mills’s (2012) SPIN theory of sharing identified size of network as a
potential factor of propagativity, or the ease with which users can share information.
This theory proposed that the bigger the network, the more viral content may show
up on a user’s social networking sites because of an increased chance of exposure.
Thus, individual exposure to a greater array of, and more viral, content is in part a
function of the user’s network size.
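Mills’s size-of-network claim follows from elementary probability. As a back-of-the-envelope illustration (ours, not part of SPIN theory’s formalism): if each friend independently shares a given viral item with probability p, the chance of at least one exposure is 1 − (1 − p)^n, which rises quickly with network size.

```python
def exposure_probability(n_friends: int, p_share: float) -> float:
    """Chance that at least one of n friends shares a given item,
    assuming each shares independently with probability p_share."""
    return 1 - (1 - p_share) ** n_friends

# even a rarely shared item (0.5% per friend) is likely seen in a large network
small = exposure_probability(100, 0.005)   # ~0.39
large = exposure_probability(1000, 0.005)  # ~0.99
```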
Individuals, of course, are not simply passive receivers of information: They also
seek it out (Fisher & Julien, 2009; Kuhlthau, Heinström, & Todd, 2008). Social
media users may actively seek out news sources, science articles, or other content
directly and share those links on social media (analogous to journalists’ gathering
of information, Paletz, 2002). Information search can be a multi-stage process in its
own right, including affective and cognitive responses (Kuhlthau et al., 2008).
Conducting information search for a class project, for example, is a knowledge
construction task that entails an initial increase and then subsequent decrease in
uncertainty and anxiety (Kuhlthau et al., 2008). Less structured information search
processes could also occur in the context of sharing information on social media.1
The phenomenon of homophily can also affect what individuals seek out, as indi-
viduals gather and check news from sources they trust and follow like-minded friends
and family on social media (termed selective exposure; Paletz, Koon, Whitehead, &
Hagens, 1972; see also Sect. 3.2.1). Using a nationally representative survey
matched with web traffic data (N = 2,525), Guess, Nyhan, and Reifler (2018) found
that selective exposure to fake news during the October to November period of the
2016 U.S. presidential campaign occurred mainly among Trump supporters and
older adults (60+). Facebook appeared to serve as a significant conduit to fake news
sites, and this seeking out predominantly involved Trump supporters visiting
pro-Trump fake news sites.
Flaxman et al. (2016) defined four channels through which news stories can be
discovered: direct, aggregator, social, and search. Direct discovery means a user has
gone directly to a news domain (such as nytimes.com); the aggregator channel
refers to platforms like Google News or Apple News where users are presented with
a set of links of related news topics hosted on other sites; social involves the use of
a social media platform, like Facebook, Twitter or an e-mail service; and search
involves the use of a web query on a search engine like Google, Bing or Yahoo. In a
study that analyzed web-browsing records using a data set of 2.3 billion distinct
page views, Flaxman et al. (2016) found that much of news consumption comes
from individuals simply visiting the homepage for their preferred news outlet,
which tended to be mainstream media sources. Echoing other studies (e.g., Bakshy
et al., 2015), they found that the use of social networks and search engines is
associated with the highest levels of ideological segregation. Thus, although social
media users often obtain their news directly from news sites rather than from their
social contacts, the users remain ideologically isolated (Flaxman et al., 2016). Given
that news sources themselves tend to link to similar perspectives in the stories they
post (Turetsky & Riddle, 2017), this finding means that active gathering does not
remove the possibility of echo chambers.
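To make the four-channel taxonomy concrete, a page view could be assigned to a channel by its referring domain, roughly as follows (a simplified sketch; the domain lists are illustrative assumptions, not Flaxman et al.’s actual classification rules):

```python
from urllib.parse import urlparse

# illustrative, non-exhaustive domain lists (our assumptions)
AGGREGATOR = {"news.google.com", "apple.news"}
SOCIAL = {"facebook.com", "twitter.com", "t.co", "mail.google.com"}
SEARCH = {"google.com", "bing.com", "search.yahoo.com"}

def classify_channel(referrer_url: str) -> str:
    """Assign a news page view to one of Flaxman et al.'s (2016) four
    discovery channels based on the referring domain."""
    if not referrer_url:
        return "direct"  # typed in or bookmarked, no referrer
    host = urlparse(referrer_url).netloc.lower().removeprefix("www.")
    if host in AGGREGATOR:
        return "aggregator"
    if host in SOCIAL:
        return "social"
    if host in SEARCH:
        return "search"
    return "direct"  # e.g., navigation within the outlet's own site

classify_channel("https://news.google.com/topstories")  # "aggregator"
classify_channel("https://www.facebook.com/some-post")  # "social"
```

Note that e-mail referrers are grouped under "social" here, matching Flaxman et al.’s inclusion of e-mail services in the social channel.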
That noted, selective exposure and homophily do not always lead to blind accep-
tance of information. In an interview study of 58 people who had just chosen to
watch a film on the Vietnam War in a movie theater, 28 disliked it, with 9 consider-
ing it propaganda and 7 saying they had become more sympathetic to America’s war
in Vietnam because they had viewed the film (Paletz et al., 1972). The researchers
determined qualitatively that the American audience was annoyed and frustrated by
the foreign film’s unconventional structure and style, as well as its hectoring tone.
This study illustrates that even with homophily, reactions to source material cannot
be taken for granted.

1 There is a significant difference between our model and the Information Search Process model (Kuhlthau et al., 2008): Our cognitive and affective factors are reactions to information rather than reactions to the process of going through information search.

2.4 Summary of Sources of Messages
Social media users may create content, observe a message or narrative, or seek one
out. What people view is influenced by what they search for and what is being shared
already in their social networks. Once they receive, create, or find information, gen-
uine actors then have some kind of reaction, be it affective, cognitive, or both.
Chapter 3
Reactions to the Message and Messenger
Abstract Genuine actors will react to the message they read and the messenger
they encounter. We categorized these reactions into those related to affect and
engagement (e.g., high arousal-specific emotions such as surprise or disgust) and
those related to cognitive factors that influence belief, including factors that prompt
individuals to engage in heuristic thinking. These affective and cognitive factors
often interact in complex ways. We also categorized entertainment, humor, and
intellectual engagement as inherently both related to affect and cognition. Believing
in the content of the message is not necessary for individuals to share it online.
Genuine actors react to the messages and messengers they encounter. Psychology,
particularly the fields of social influence, persuasion, and decision making, has gen-
erated many findings regarding how human beings process and judge messages
(Brock & Green, 2005). We divide this section into primarily affective and cognitive
reactions with the caveat that for at least two decades, researchers have increasingly
understood how these two can impact each other and are interconnected (e.g.,
Loewenstein & Lerner, 2003; Sharot, Delgado, & Phelps, 2004). The factors
described in the affect and cognitive sections will influence each other, often in
subtle ways, even if they were often studied in isolation. In particular, affect is
related to ideology and belief in complex ways (Papacharissi, 2017). For instance,
beliefs are more likely to be changed if the piece of news is positive rather than
negative (Sharot & Garrett, 2016). In both the United States and the Netherlands,
sensitivity to feelings of disgust is associated with conservativism (e.g., Brenner &
Inbar, 2015; Inbar, Pizarro, Iyer, & Haidt, 2012). Experiencing a threat may make
liberals shift to conservativism, and a recent study suggests that inducing feelings of
safety and security may shift some conservatives to more socially progressive
stances (Napier, Huang, Vonasch, & Bargh, 2017). Given that political ideology can
influence what types of arguments may be persuasive (Jost & Krochik, 2014), affect
and cognition are not truly separate in this domain. We therefore also discuss factors
that are inherently and conceptually both cognitive and affective (e.g., intellectual
engagement).
The affect section is further divided into high arousal emotions and other affec-
tive engagement, and the cognition section is divided into factors that encourage
belief and attitudes, and those cognitive reactions unrelated to belief (or such that
belief is unnecessary; see Fig. 3.1).
Past research has suggested that emotion is a factor in sharing behavior (e.g., Berger
& Milkman, 2012; Hasell & Weeks, 2016; Peters, Kashima, & Clark, 2009). High
arousal emotions are those that feel more intense and may entail greater physiologi-
cal arousal. A high-arousal, positive emotion might be intense joy; fury is a high-
arousal, negative emotion. Contentment is a low-arousal, positive emotion, and
light sadness is a low-arousal, negative emotion. In general, stories and articles that
elicit emotions are more likely to prompt an intention to share than stories and
articles that do not elicit emotions. Theory and research on rumors, a type of actively
shared narrative, suggest that they tend to fall into four types: those that express fear
of a negative outcome, those that express hope for a positive outcome, those that
express hostility toward a group of people, and those that express curiosity about
intellectually puzzling topics (Bordia & DiFonzo, 2007; Silverman, 2015). Three
of these types involve strong, arousing emotions, and the fourth is an intellectually
engaging state. Rumors are thought to spread due to a combination of threat man-
agement and sensemaking, and high arousal rumors are more likely to be shared
(Silverman, 2015).
In researching the valence of hundreds of New York Times articles that made the
most emailed list, Berger and Milkman (2012) found that positive content was
shared more than non-positive content. The researchers also found that content
evoking high arousal emotions like anger and anxiety boosted the likelihood of
sharing, regardless of valence.
Research on non-digital information-sharing behavior suggests that emotions
play a significant role in sharing and participants’ willingness to pass along infor-
mation offline, as well (Berger, 2011; Peters et al., 2009). In particular, the level of
arousal of an emotion may make a difference. In a non-digital study of intention to
share information, emotionality had a significant effect on participants’ willingness
to pass along anecdotes (N = 160; Peters et al., 2009). Anecdotes that had high or
medium emotionality were more likely to be shared than anecdotes with low emo-
tionality, a finding that the authors replicated with high versus low emotionality in a
smaller sample that actually shared the anecdotes (N = 40). In addition, anecdotes
that aroused interest, surprise, disgust and happiness were rated as being likely to be
shared, whereas anecdotes with fear, contempt, or sadness were not significantly
likely to be shared. In their second, smaller study, Peters et al. (2009) did not find
any differences for the specific emotions of happiness, sadness, or disgust, control-
ling for emotionality. In a separate study, arousal, but not positive or negative
valence, had a significant effect on intention to share news stories (N = 93; Berger,
2011). In that study, low arousal emotions were contentment or sadness, whereas
high arousal emotions were amusement or anxiety.
In an example of how content triggering high-arousal emotions may impact shar-
ing, a false story from the English corpus about Australia becoming the “most
microchipped nation” was shared multiple times on Facebook (Appendix A, Table
A1, story #433). In a public post on the user’s personal profile in 2017, one Facebook
user shared a link to a version of the story on YouTube. The user added the
commentary, “Do these people realise what they are actually doing to there body’s
[sic].this.is soooooo worrying …” (see Fig. 3.2, post on left).

Fig. 3.2 Facebook posts about fake story of Australia microchipping citizens
Another Facebook user posted a link to a version of the story on a site called sur-
vivaldan101.com. The Facebook user added commentary by writing, “And Apocalypse
is already upon them ♥ Excellent job!! Everybody will see what will happen to them
for accepting to be slaves of satan!! Stupids [sic] miserable animals!! They are not
people any more!!” (see Fig. 3.2, post on right). These comments illustrate the find-
ings by Peters et al. (2009): The information presented in this article could arguably
arouse interest, surprise and disgust in the user. By evaluating the commentary added
by Facebook users, we observed that this story invoked worry and anger.
There has also been extensive research on online sharing behavior and inten-
tions. Students were more likely to email an article if they had been jogging lightly
in place, a task that increases physiological arousal (Berger, 2011). Although the
sample was small (N = 40), the effect was large, with 75% of those who jogged
versus 35% of those who did not sharing the article. This study suggests that it is the
interpretation of the feelings of physiological arousal that encourages sharing,
rather than the specific emotions that may accompany the physiological arousal. A
separate study examined the motivations and behaviors of sharing by using semi-
structured interviews with 40 consumers born between 1978 and 1994 (Generation
Y, and mostly university students) who forwarded video content (viral messages) in
their online networks (Botha & Reyneke, 2013). The researchers showed both
content-specific (applicable to a certain group) and general videos to participants in
order to understand their emotional responses and to gauge whether or not the
participants would share the content. The general video, which was described with
the words “funny” or “laugh,” was more likely to be shared; however, participants who
did not find it funny did not feel compelled to pass it along. In addition, the more
familiar the participants were with the content, the more likely they were to have an
emotional reaction and to pass it on. Valence in this study also proved important: If
participants had a positive emotional reaction to the video, they were more likely to
pass it on, but if they had a negative (or no) emotional reaction to the video, they
were not likely to share the video. However, the negative emotions captured in this
study were disinterest or boredom, rather than high-arousal negative emotions like
anger. Thus, although this study highlights the potential importance of positive
affect, it also confounded arousal and affect valence.
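The size of the jogging effect (Berger, 2011) can be checked with a standard two-proportion z-test, assuming two equal groups of 20 (an assumption on our part; the text reports only the total N = 40):

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return z, p_value

# 75% of 20 joggers (15) vs. 35% of 20 non-joggers (7) shared the article
z, p = two_proportion_z(15, 20, 7, 20)
# z ~ 2.54, p ~ 0.011: a sizable effect despite the small sample
```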
Intense emotions are not the only type of affect related to sharing information and
narratives. Another affective reaction involves positive valence and may entail mod-
erate physiological arousal: the feeling one has when presented with something
cute. Information, narratives, and pictures may get shared on the internet simply
because they are cute (Abad-Santos, 2017; Nittono, Fukushima, Yano, Moriya, &
Paterson, 2012). Some scholars suggest that cute things are popular because they
produce positive feelings (Nittono et al., 2012). Here, we are making a distinction
between cute as adorable and cute as sexy or pretty, with the focus on the former.
Creatures are identified as cute when they have infantile (babyish) features such as
large eyes and large foreheads, heads large relative to bodies, and round cheeks
(Alley, 1981; Glocker et al., 2009). Humans are thought to instinctively respond
with caregiving desires to cute creatures (Glocker et al., 2009), even if those
creatures aren’t baby humans (Golle, Lisibach, Mast, & Lobmaier, 2013).
3.2 Cognition
As discussed above, viewers in the Paletz, Koon, Whitehead, and Hagens (1972) study had a negative emotional reaction to the movie’s tone
style, which influenced how persuasive they found its message. This section focuses
on summarizing the sizable literature on factors that influence belief in an original
message, which has been assumed to promote sharing online (e.g., Broniatowski,
Hilyard, & Dredze, 2016). However, belief is not always necessary for why indi-
viduals share narratives online. In a second subsection, we touch on those situations,
which we observed in our qualitative research and anecdotally.
There are several factors that influence readers to believe the content they read (see
Table 3.1). These factors involve source and message credibility; confirmation and
related biases; message availability, accessibility, and fluency; framing; deliberate
persuasive techniques; and individual differences of the target audience, or in this
case, potential sharer. Two theories of persuasion—the elaboration likelihood model
and the heuristic-systematic model—converged in their identification of two sepa-
rate modes of persuasion and processing social information: heuristic (or periph-
eral) and systematic modes (or central; e.g., Chaiken & Maheswaran, 1994; Chen &
Chaiken, 1999; Petty & Cacioppo, 1986; Petty, Cacioppo, Strathman, & Priester,
2005). Whereas systematic or central processing involves logically, attentively, con-
sciously, and effortfully weighing the pros and cons of information, heuristic (or
peripheral) processing entails using fast, simple heuristics and cues. Heuristic pro-
cessing can only occur if the relevant heuristics are both available in memory and
relevant (accessible) to the situation, with frequent use likely resulting in chronic
readiness of the heuristic to be used (Chen & Chaiken, 1999). The scientific litera-
ture on cognition identifies these two modes as System 1 (heuristic) and System 2
(systematic). These modes have also been argued to be fundamentally different
types of reasoning and memory in general, even beyond the processing of social
information, and to have biological bases (e.g., Evans, 2003; Smith & DeCoster,
2000). Many, but not all, of the factors listed below involve activating one of these
types of information processing. Although many of these factors involve heuristic
processing, it would be inaccurate to assume that relying solely on systematic
processing would immunize people against spreading fake news. Many of the factors below
entail both types of processing, and systematic processing is less helpful when the
information available is incorrect or misrepresented.
Source and Message Credibility
The first set of factors, which generally but not exclusively involves heuristic think-
ing, entails credibility and attractiveness of the source, as well as message credibil-
ity (e.g., Heesacker, Petty, & Cacioppo, 1983; Pornpitakpan, 2004; Schwarz, Sanna,
Skurnik, & Yoon, 2007; Swire, Berinsky, Lewandowsky, & Ecker, 2017). For exam-
ple, one of the inaccurate news stories in our qualitative analysis was titled “NPR:
25 Million Votes For Clinton ‘Completely Fake’ – She Lost Popular Vote” (see
Appendix A, Table A1, story #106 for more detail). National Public Radio (NPR) is
commonly considered a high-standard and generally nonpartisan news source,
though the American right considered it leftist and biased in favor of Clinton, or at
least non-independent (Langlois, 2016). In this context, NPR may have been cited
to either identify the source as credible or to imply that the story must be true
because it conflicted with the perceived liberal bias of NPR. Facebook users repost-
ing this story explicitly raised the issue of NPR as source credibility for liberals,
such as “NPR? Wow! I think we all knew this to be true” and “My liberal friends
will break their butts saying this is ‘fake News.’”
Both source attractiveness and credibility have consistent effects on persuasive-
ness, making a message more appealing (Petty et al., 2005; Pornpitakpan, 2004).
Physical attractiveness seems to have a positive effect when perceived expertise of
a source is low (Pornpitakpan, 2004). Source credibility is similarly often linked to
message credibility. In a study of 220 students at a journalism school, individuals
judged online information, but not advertisements, based on the perceived credibil-
ity of the web source (New York Times versus a personal home page; Greer, 2003).
Quick cues and heuristics may indicate credibility even when the cues are not indic-
ative of accuracy. For example, non-native speakers of English were rated as less
credible than native speakers when stating trivia facts (Lev-Ari & Keysar, 2010).
Although the participants were able to correct this bias when the accents were mild,
they were unable to overcome this bias when the accents were heavy. In another
study, claims were more likely to be believed when the individuals to whom they
were attributed had easily pronounced names, even controlling for region of origin
of those names (Newman et al., 2014). Source credibility can be judged by per-
ceived authority (power), trustworthiness, and/or expertise (Pornpitakpan, 2004).
For example, in our Russian corpus, the story about foreign nationals collecting
biological material in Russia resonated widely on social media, possibly because it
originated with President Putin and aired on national TV, and was thus considered
highly credible (see Appendix A, Table A2, story #1).
Expertise entails whether the source is perceived as making accurate statements
(Pornpitakpan, 2004). A series of studies suggested a small but consistent effect of
having citations in a text on judgments of the truth of claims (Putnam & Phelps,
2017). The citations may serve as a heuristic for perceived expertise.
Credibility can also be attributed because of a contrast to nearby information. In a
large study (N = 877), a news story embedded in an impolite partisan blog appeared