Disaster of Misinformation
https://doi.org/10.1007/s41060-022-00311-6
REVIEW
Received: 31 May 2021 / Accepted: 6 January 2022 / Published online: 15 February 2022
© The Author(s), under exclusive licence to Springer Nature Switzerland AG 2022
Abstract
The spread of misinformation in social media has become a severe threat to public interests. For example, several incidents of public health concern arose out of social media misinformation during the COVID-19 pandemic. Against the backdrop of the emerging IS research focus on social media and the impact of misinformation during recent events such as the COVID-19 pandemic, the Australian bushfires, and the US elections, we identified disaster, health, and politics as specific domains for a research review on social media misinformation. Following a systematic review process, we chose 28 articles, relevant to the three themes, for synthesis. We discuss the characteristics of misinformation in the three domains, the methodologies that have been used by researchers, and the theories used to study misinformation. We adapt an Antecedents-Misinformation-Outcomes (AMIO) framework for integrating key concepts from prior studies. Based on the AMIO framework, we further discuss the inter-relationships of concepts and the strategies to control the spread of misinformation on social media. Ours is one of the early reviews focusing on social media misinformation research, particularly on three socially sensitive domains: disaster, health, and politics. This review contributes to the emerging body of knowledge in Data Science and social media and informs strategies to combat social media misinformation.
on social media platforms. As a result of the targeting, it eventually translated into physical violence and discriminatory treatment against members of the community in some of the Indian states [7]. 'Rumors' and 'fake news' are related terms for misinformation: 'rumors' are unverified information or statements circulated with uncertainty, and 'fake news' is misinformation that is distributed in an official news format. Source ambiguity, personal involvement, confirmation bias, and social ties are some of the rumor-causing factors. Yet another related term, mal-information, is accurate information that is used in a different context to spread hatred or abuse of a person or a particular group. Our review focuses on misinformation that is spread through social media platforms. The words 'rumor' and 'misinformation' are used interchangeably in this paper. Further, we identify factors that cause misinformation based on a systematic review of prior studies.

Ours is one of the early attempts to review social media research on misinformation. This review focuses on the three sensitive domains of disaster, health, and politics, setting three objectives: (a) to analyze previous studies to understand the impact of misinformation on the three domains, (b) to identify theoretical perspectives used to examine the spread of misinformation on social media, and (c) to develop a framework to study key concepts and their inter-relationships emerging from prior studies. We identified these specific areas because the impact of misinformation in them, with regard to both speed of spread and scale of influence, is high and detrimental to the public and governments. To the best of our knowledge, reviews of the literature on social media misinformation themes are relatively scanty. This review contributes to an emerging body of knowledge in Data Science and informs efforts to combat social media misinformation. Data Science is an interdisciplinary area which incorporates areas like statistics, management, and sociology to study data and create knowledge out of data [8]. This review will also inform future studies that aim to evaluate and compare patterns of misinformation on sensitive themes of social relevance, such as disaster, health, and politics.

The paper is structured as follows. The first section introduces misinformation in the social media context. In Sect. 2, we provide a brief overview of prior research on misinformation and social media. Section 3 describes the research methodology, which includes details of the literature search and selection process. Section 4 discusses the analysis of the spread of misinformation on social media based on the three themes (disaster, health, and politics) and the review findings. This includes the current state of research, theoretical foundations, determinants of misinformation in social media platforms, and strategies to control the spread of misinformation. Section 5 concludes with the implications and limitations of the paper.

2 Social media and spread of misinformation

Misinformation arises in uncertain contexts when people are confronted with a scarcity of the information they need. During unforeseen circumstances, the affected individual or community experiences nervousness or anxiety, and anxiety is one of the primary reasons behind the spread of misinformation. To overcome this tension, people tend to gather information from sources such as mainstream media and official government social media handles to verify the information they have received. When they fail to receive information from official sources, they collect related information from their peer circles or other informal sources, which helps them to control social tension [9]. Furthermore, in an emergency context, misinformation helps community members to reach a common understanding of the uncertain situation.

2.1 The echo chamber of social media

Social media has increasingly grown in power and influence and has acted as a medium to accelerate sociopolitical movements. Network effects enhance participation in social media platforms, which in turn spreads information (good or bad) at a faster pace compared to traditional media. Furthermore, due to a massive surge in online content consumption, primarily through social media, both business organizations and political parties have begun to share content that is ambiguous or fake to influence online users and their decisions for financial and political gains [9, 10]. On the other hand, people often approach social media with a hedonic mindset, which reduces their tendency to verify the information they receive [9]. Repetitive exposure to content that coincides with pre-existing beliefs increases the believability and shareability of content. This process, known as the echo-chamber effect [11], is fueled by confirmation bias: the tendency of a person to support information that reinforces pre-existing beliefs and to neglect opposing perspectives and viewpoints.

Platforms' structure and algorithms also have an essential role in spreading misinformation. Tiwana et al. [12] have defined platform architecture as 'a conceptual blueprint that describes how the ecosystem is partitioned into a relatively stable platform and a complementary set of modules that are encouraged to vary, and the design rules binding on both'. The business models of these platforms are based upon maximizing user engagement. For example, in the case of Facebook or Twitter, user feeds are built around users' existing beliefs or preferences: feeds surface content similar to what users already believe, thus contributing to the echo chamber effect.

Platform architecture makes the transmission and retransmission of misinformation easier [12, 13]. For instance, WhatsApp has a one-touch forward option that enables users to forward a message to many contacts at once.
Fig. 1 Articles published on misinformation during 2005–2021 (databases: Scopus, Springer, and EBSCO)
3.1 Inclusion–exclusion criteria

Figure 2 depicts the systematic review process followed in this study. In our preliminary search, 2148 records were retrieved from the databases; all of these articles were gathered onto a spreadsheet, which was manually cross-checked against the journals linked to the articles. The inclusion criteria were: studies published during 2005–2021, studies published in the English language, articles published in peer-reviewed journals, journal rating, and relevance to misinformation. We excluded reviews, theses, dissertations, and editorials, as well as articles on misinformation not related to social media. To fetch the best from these articles, we selected articles from top journals, rated above three according to the ABS rating and A*, A, or B according to the ABDC rating. This process, while ensuring the quality of the papers, also effectively narrowed the purview of the study to 643 articles of acceptable quality. We did not perform track-back and track-forward searches on references. During this process, duplicate records were also identified and removed. Further screening of articles based on title, abstract, and full text (wherever necessary) brought the number down to 207 articles, and screening against the three themes reduced the focus to 89 articles. We conducted a full-text analysis of these 89 articles, excluded those that had not considered misinformation as a central theme, and finally arrived at 28 articles for detailed review (Table 1); this screening funnel is summarized in the sketch below.

The selected studies used a variety of research methods to examine misinformation on social media. Experimentation and text mining of tweets emerged as the most frequent research methods: 11 studies used experimental methods and eight used Twitter data analyses. Apart from these, there were three survey studies, two mixed-methods studies, two case studies, and one study each using opportunistic sampling and an exploratory approach. The selected literature includes nine articles on disaster, eight on healthcare, and eleven on politics. We preferred papers based on the three major social media platforms Twitter, Facebook, and WhatsApp, as these are the three platforms with the highest transmission rates and most active users [25], and the platforms most likely to propagate misinformation.
tion and text mining of tweets emerged as the most frequent
having a different areas of expertise to ensure reliability. The
The external coder agreed on 26 of the 28 articles (92.8%), which indicated a high level of intercoder reliability [49]. The independent researcher's disagreement about the codes for the remaining two articles was discussed between the authors and the research scholar, and a consensus was arrived at.
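Both agreement figures reported here (82.1% between the authors, 92.8% with the external coder) are percent-agreement computations. As an illustrative sketch, the Python below computes percent agreement and a chance-corrected Cohen's kappa on hypothetical coder labels; the two simulated disagreements are invented.

```python
from collections import Counter

# Hypothetical theme labels assigned to the 28 reviewed articles by two coders.
coder_a = ["disaster"] * 9 + ["health"] * 8 + ["politics"] * 11
coder_b = list(coder_a)
coder_b[3], coder_b[12] = "health", "politics"  # two invented disagreements

n = len(coder_a)
p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n  # 26/28 = 0.928

# Cohen's kappa corrects the observed agreement for chance agreement.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
p_chance = sum(freq_a[t] / n * freq_b[t] / n for t in freq_a)
kappa = (p_observed - p_chance) / (1 - p_chance)

print(f"percent agreement = {p_observed:.3f}, Cohen's kappa = {kappa:.3f}")
```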
4 Results

We initially reviewed articles separately within the three categories of disaster, health, and politics. We first present the emergent issues that cut across these themes.

4.1 Social media misinformation research

Disaster, health, and politics emerged as the three domains ("Appendix") where misinformation can cause severe harm, often leading to casualties or even irreversible effects. Mitigating these effects can also demand a substantial financial or human resource burden, considering the scale of the effects and the risk of spreading negative information to the public. All these areas are sensitive in nature. Further, disaster, health, and politics have gained the attention of researchers and governments, as the challenges of misinformation confronting these domains are rampant. Besides sensitivity, misinformation in these areas has a higher potential to exacerbate existing crises in society. During the 2020 Munich Security Conference, WHO's Director-General noted: "We are not just fighting an epidemic; we are fighting an infodemic", referring to the fact that COVID-19 misinformation spreads faster than the virus [50].
Table 1 Reviewed articles (Sl. no., year, author, theory, theme, platform and method)
More than 6000 people were hospitalized due to COVID-19-related misinformation in the first three months of 2020 [51]. As COVID-19 vaccination began, one of the popular myths was that Bill Gates wanted to use vaccines to embed microchips in people to track them, and this created vaccine hesitancy among citizens [52]. These reports show the severity of the spread of misinformation and how misinformation can aggravate a public health crisis.

4.2 Misinformation during disaster

In the context of emergency situations (unforeseen circumstances), the credibility of social media information has often been questioned [11]. When a crisis occurs, affected communities often experience a lack of the localized information they need to make emergency decisions. This accelerates the spread of misinformation, as people tend to fill the information gap with misinformation or 'improvised news' [9, 24, 25]. The broadcasting power of social media and the re-sharing of misinformation could weaken and slow down rescue operations [24, 25]. As local people have more access to the disaster area, they become the immediate reporters of a crisis through social media; mainstream media comes into the picture only later. However, recent incidents reveal that voluntary reporting of this kind has begun to affect rescue operations negatively, as it often acts as a collective rumor mill [9] that propagates misinformation. During the 2018 floods in the South Indian state of Kerala, a fake video on a Mullaperiyar Dam leakage created unnecessary panic among citizens, thus negatively impacting rescue operations [53]. Information from mainstream media is relatively more reliable, as mainstream media have traditional gatekeepers, such as peer reviewers and editors, who cross-check the information source before publication. Chua et al. [28] found that a major chunk of corrective tweets were retweeted from mainstream news media; thus, mainstream media is considered a preferred rumor correction channel, where attempts are made to correct misinformation with the right information.

4.2.1 Characterizing disaster misinformation

Oh et al. [9] studied citizen-driven information processing based on three social crises using rumor theory. The main characteristic of a crisis is the complexity of information processing and sharing [9, 24]. A task is considered complex when characterized by an increase in information load, information diversity, or the rate of information change [54]. Information overload and information dearth are the two grave concerns that interrupt communication between the affected community and a rescue team. Information overload, where too many enquiries and fake news distract a response team, slows down its recognition of valid information [9, 27]. According to Balan and Mathew [55], information overload occurs when the volume of information (for example, the complexity of words and multiple languages) exceeds what a human being can process. Information dearth, in our context, is the lack of the localized information that is supposed to help the affected community make emergency decisions. When official government communication channels or mainstream media cannot fulfill citizens' needs, they resort to information from their social media peers [9, 27, 29].

In a social crisis context, Tamotsu Shibutani [56] defines rumoring as the collective sharing and exchange of information, which helps community members to reach a common understanding of the crisis situation [30]. This mechanism also operates on social media, where both information dearth and information overload arise. Anxiety, information ambiguity (source ambiguity and content ambiguity), personal involvement, and social ties are the rumor-causing variables in a crisis context [9, 27]. In general, anxiety is a negative feeling caused by distress or a stressful situation, which fabricates or produces adverse outcomes [57]. In the context of a crisis or emergency, a community may experience anxiety in the absence of reliable information or, in other cases, when confronted with an overload of information, making it difficult to take appropriate decisions. Under such circumstances, people may tend to rely on rumors as a primary source of information. The influence of anxiety is higher during a community crisis than during a business crisis [9]. However, anxiety, as an attribute, varies based on the nature of the platform; for example, Oh et al. [9] found that the Twitter community does not succumb to social pressure in the way the WhatsApp community does [30]. Simon et al. [30] developed a model of rumor retransmission on social media and identified information ambiguity, anxiety, and personal involvement as motives for rumormongering. Attractiveness is another rumor-causing variable; it operates when aesthetically appealing visual aids or designs capture a receiver's attention. Here believability matters more than the content's reliability or the truth of the information received.

The second stage of the spread of misinformation is misinformation retransmission. Apart from the rumor-causing variables reported in Oh et al. [9], Liu et al. [13] found sender credibility and attractiveness to be significant variables related to misinformation retransmission. Personal involvement and content ambiguity can also affect misinformation transmission [13]. Abdullah et al. [25] explored retweeters' motives for spreading disaster information on the Twitter platform. Content relevance, early information [27, 31], trustworthiness of the content, emotional influence [30], retweet count, pro-social behavior (altruistic behavior among citizens during the crisis), and the need to inform one's circle are the factors that drive users' retweets [25]. Lee et al. [26] have also examined the impact of Twitter features on message diffusion based on the 2013 Boston Marathon tragedy. The study reported that during crisis events (especially during disasters), a tweet with a shorter reaction time (the time between the crisis and the initial tweet) had a higher impact than other tweets. This shows that, to an extent, misinformation can be controlled if officials communicate at the early stage of a crisis [27].
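Reaction time, as used here, can be computed directly from timestamps. A minimal sketch with invented event and tweet times:

```python
from datetime import datetime

# Invented timestamps: crisis onset and each account's first related tweet.
crisis_onset = datetime(2013, 4, 15, 14, 50)
first_tweet = {
    "local_witness": datetime(2013, 4, 15, 14, 56),
    "official_handle": datetime(2013, 4, 15, 16, 10),
}

# Reaction time = time between the crisis and the initial tweet; shorter
# reaction times were associated with higher message impact [26].
for account, ts in first_tweet.items():
    minutes = (ts - crisis_onset).total_seconds() / 60
    print(f"{account}: reaction time {minutes:.0f} min")
```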
Liu et al. [13] showed that tweets with hashtags influence the spread of misinformation. Further, Lee et al. [26] found that tweets with no hashtags had more influence, owing to contextual differences: the usage of hashtags for marketing or advertising has a positive impact, while in disaster or emergency situations (as in the case of Twitter), the usage of hashtags has a negative impact. Messages with no hashtag get more widely diffused than messages with a hashtag [26].

Oh et al. [15] explored the behavioral aspects of social media participants that led to the retransmission and spread of misinformation. They found that when people believe a threatening piece of misinformation they received, they are more likely to spread it, and they take necessary safety measures (sometimes even extreme actions). Repetition of the same misinformation from different sources also makes it more believable [28]. However, when people realize the received information is false, they are less likely to share it with others [13, 26]. The characteristics of the platform used to deliver the misinformation also matter. For instance, the number of likes and shares of the information increases the believability of the social media post [47].

In summary, we found that platform architecture also has an essential role in the spread and believability of misinformation. While conducting this systematic literature review, we observed that most studies on disaster and misinformation are based on the Twitter platform; six of the nine papers we reviewed in the disaster area were based on Twitter. When a message was delivered in video format, it had a higher impact compared to audio or text messages. If the message had a religious or cultural narrative, it led to behavioral action (danger control response) [15]. Users were more likely to spread misinformation through WhatsApp than Twitter, and it was difficult to find the source of information shared on WhatsApp [30].

4.3 Misinformation related to healthcare

From our review, we found two systematic literature reviews that discuss health-related misinformation on social media. Yang et al. [58] explore the characteristics, impact, and influences of health misinformation on social media. Wang et al. [59] address health misinformation related to vaccines and infectious diseases. These reviews show that health-related misinformation, especially on the M.M.R. vaccine and autism, spreads widely on social media, and the government is unable to control it.

The spread of health misinformation is an emerging issue facing public health authorities. Health misinformation could delay proper treatment to patients, which could further add more casualties to the public health domain [28, 59, 60]. Often people tend to believe health-related information that is shared by their peers, and some of them share their treatment experiences or traditional remedies online. This information could be from a different context and may not even be accurate [33, 34]. Compared to health-related websites, the language used to detail health information shared on social media tends to be simple and may not include essential details [35, 37]. Some studies reported that conspiracy theories and pseudoscience have escalated casualties [33]. Pseudoscience refers to a false claim that pretends the shared misinformation has scientific evidence. The anti-vaccination movement on Twitter is one example of pseudoscience [61]; here, users might share the information due to a lack of scientific knowledge [35].

4.3.1 Characterizing healthcare misinformation

The attributes that characterize healthcare misinformation are distinctly different from those of other domains. Chua and Banerjee [37] identified the characteristics of health misinformation as dread and wish. Dread is the type of rumor which creates panic and unpleasant consequences. For example, in the wake of COVID-19, misinformation widely shared on social media claimed that children 'died on the spot' after the mass COVID-19 vaccination program in Senegal, West Africa [61]. This message created panic among citizens, as the misinformation was shared more than 7000 times on Facebook [61]. Wish is the type of rumor that gives hope to the receiver (e.g., a rumor on free medicine distribution) [62]. A dread rumor looks more trustworthy and is more likely to go viral; a dread rumor was the cause of violence against a minority group in India during COVID-19 [7]. Chua and Banerjee [32] added pictorial and textual representations as characteristics of health misinformation. A rumor that contains only text is a textual rumor; a pictorial rumor, on the other hand, contains both text and images. However, Chua and Banerjee [32] found that users prefer textual rumors to pictorial ones. Unlike rumors that circulate during a natural disaster, health misinformation is long-lasting and can spread across boundaries. Personal involvement (the importance of the information for both sender and receiver), rumor type, and the presence of counter-rumors are some of the variables that can escalate users' trusting and sharing behavior related to rumors [37]. Madraki et al. [46], studying COVID-19 misinformation/disinformation, reported that COVID-19 misinformation on social media differs significantly across languages, countries, and their cultures and beliefs. The acceptance of social media platforms, as well as governmental censorship, also plays an important role here.

Widespread misinformation could also change collective opinion [29]. Online users' epistemic beliefs could control their sharing decisions. Chua and Banerjee [32] argued that epistemologically naïve users (users who think knowledge can be acquired easily) are the type of users who accelerate the spread of misinformation on platforms. Those who
read or share the misinformation are not likely to follow it [37]. Gu and Hong [34] examined health misinformation in the mobile social media context. Mobile internet users are different from large-screen users: a mobile phone user might have a stronger emotional attachment toward the gadget, which also motivates them to believe received misinformation. Corrective efforts focused on large-screen users may not work with mobile or small-screen users. Chua and Banerjee [32] suggested that the simplified sharing options of platforms also motivate users to share received misinformation before validating it. Shahi et al. [47] found that misinformation is also propagated or shared even by verified Twitter handles; these become part of misinformation transmission either by creating it or by endorsing it through likes or shares.

The focus of existing studies is heavily based on data from social networking sites such as Facebook and Twitter, although other platforms also escalate the spread of misinformation. Such a phenomenon was evident in the wake of COVID-19, as an intense trend of misinformation spread was reported on WhatsApp, TikTok, and Instagram.

4.4 Social media misinformation and politics

Kim and Dennis [42] examined the effects of information presentation format and reported that social media platforms indirectly force users to accept certain information; they present information such that little importance is given to the source of the information. This presentation is manipulative, as people tend to believe information from a reputed source and are more likely to reject information from a less-known source [42].

Pennycook et al. [39] and Garrett and Poulsen [40] argued that warning tags (or flagging) on headlines can reduce the spread of misinformation. However, it is not practical to assign warning tags to all misinformation, as it is generated faster than valid information, and the fact-checking process on social media also takes time. Hence, people tend to believe that headlines without warning tags are true, and the idea of warning tags will thus not serve its purpose [39]. Furthermore, it could increase readers' belief in warning tags and lead to misperception [39]: readers tend to believe that all information has been verified and consider untagged false information as more accurate. This phenomenon is known as the implied truth effect [39]. In this case, a source reputation rating will influence the credibility of the information; the reader gives less importance to a source that has a low rating [17, 50].
Kim et al. [36] examined three different rating mechanisms: expert rating, user article rating, and user source rating. Reputation theory was used to show how users would discern cognitive biases in expert ratings.

Murungi et al. [38] used rhetorical theory to argue that fact-checkers have limited effectiveness against fake news that spreads on social media platforms. The study proposed a different approach, focusing on the underlying belief structure that accepts misinformation. The theory was used to identify fake news and socially constructed beliefs in the context of Alabama's senatorial election in 2017. Using the third-person effect as the theoretical ground, the characteristics of rumor corrections on the Twitter platform have also been examined in the context of the death hoax of Singapore's first prime minister Lee Kuan Yew [28]. This paper explored the motives behind collective rumor and identified the key characteristics of collective rumor correction. Using situational crisis communication theory (SCCT), Paek and Hove [44] examined how government could effectively respond to risk-related rumors during national-level crises in the context of a food safety rumor. Refuting the rumor, denying it, and attacking the source of the rumor are the three rumor response strategies suggested by the authors to counter rumor-mongering (Table 2).

5.1 Determinants of misinformation in social media platforms

Figure 3 depicts the concepts that emerged from our review using an Antecedents-Misinformation-Outcomes (AMIO) framework, an approach we adapt from Smith et al. [66]. Originally developed to study information privacy, the Antecedents-Privacy-Concerns-Outcomes (APCO) framework provided a nomological canvas to present determinants, mediators, and outcome variables pertaining to information privacy. Following this canvas, we discuss the antecedents of misinformation, the mediators of misinformation, and misinformation outcomes, as they emerged from prior studies (Fig. 3).

Anxiety, source ambiguity, trustworthiness, content ambiguity, personal involvement, social ties, confirmation bias, attractiveness, illiteracy, ease of sharing options, and device attachment emerged as the variables determining misinformation in social media.
Anxiety is the emotional state of the person who sends or receives the information: if a person is anxious about the information received, he or she is more likely to share or spread misinformation [9]. Source ambiguity concerns the origin of the message; when a person is convinced of the source of the information, its perceived trustworthiness increases and the person shares it. Content ambiguity addresses the clarity of the information's content [9, 13]. Personal involvement denotes how important the information is to both sender and receiver [9]. Social ties matter because information shared by a family member or social peers will influence a person to share the information [9, 13]. From prior literature, it is understood that confirmation bias is one of the root causes of political misinformation. Research on the attractiveness of received information reveals that users tend to believe and share information received on their personal device [34]. After receiving misinformation from various sources, users accept it based on their existing beliefs and on social, cognitive, and political factors. Oh et al. [15] observed that during crises, people by default have a tendency to believe unverified information, especially when it helps them to make sense of the situation. Misinformation has significant effects on individuals and society: loss of lives [9, 15, 28, 30], economic loss [9, 44], loss of health [32, 35], and loss of reputation [38, 43] are the major outcomes of misinformation that emerged from our review.
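To make the framework concrete, the sketch below encodes the AMIO categories as a simple mapping. The antecedent and outcome labels are taken from this section; treating belief and retransmission as the mediating misinformation behaviors is our own illustrative assumption.

```python
# Illustrative encoding of the AMIO framework; labels compiled from Sect. 5.1.
AMIO = {
    "antecedents": [
        "anxiety", "source ambiguity", "trustworthiness", "content ambiguity",
        "personal involvement", "social ties", "confirmation bias",
        "attractiveness", "illiteracy", "ease of sharing options",
        "device attachment",
    ],
    # Assumed mediators: believing the misinformation and retransmitting it.
    "misinformation": ["belief", "sharing/retransmission"],
    "outcomes": [
        "loss of lives", "economic loss", "loss of health", "loss of reputation",
    ],
}

# Example query: antecedents reported by the rumor-theory studies [9, 13].
reported = {"anxiety", "source ambiguity", "content ambiguity",
            "personal involvement", "social ties"}
print(sorted(reported & set(AMIO["antecedents"])))
```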
5.2 Strategies for controlling the spread of misinformation

Discourse on social media misinformation mitigation has resulted in the prioritization of strategies such as early communication from officials and the use of scientific evidence [9, 35]. When people realize that a received message is false, they are less likely to share it with others [15]. Another strategy is rumor refutation: reducing citizens' intention to spread misinformation by providing real information, which reduces their uncertainty and serves to control misinformation [44]. Rumor correction models for social media platforms also employ algorithms and crowdsourcing [28]. The majority of the papers we reviewed suggested fact-checking by experts, source rating of the received information, attaching warning tags to headlines or entire news items [36], and flagging of content by the platform owners [40] as strategies to control the spread of misinformation. Studies on controlling misinformation in the public health context showed that the government could also seek the help of public health professionals to mitigate misinformation [31].

However, the aforementioned strategies have been criticized for several limitations. Most papers mentioned confirmation bias as having a significant impact on misinformation mitigation strategies, especially in the political context, where people tend to believe information that matches their prior beliefs. Garrett and Poulsen [40] argued that during an emergency, misinformation recipients may not be able to characterize misinformation as true or false; thus, providing an alternative explanation or the real information to users has more effect than providing a fact-checking report. Studies by Garrett and Poulsen [40] and Pennycook et al. [39] reveal a drawback of attaching warning tags to news headlines.
Table 2 Theories used in the reviewed studies (theory, definition, articles, theme)

Rumor theory. Definition: "A collective and collaborative transaction in which community members offer, evaluate, and interpret information to reach a common understanding of uncertain situations, to alleviate social tension, and to solve collective crisis problems" [9]. Articles: [9, 13, 15, 37, 43]. Theme: disaster, health.

Diffusion theory. Definition: In IS research, diffusion theory has been used to discern the adoption of technological innovation. Diffusion theory involves "the process by which an innovation is communicated through certain channels over time among the members of a social system." Articles: [26]. Theme: disaster.

Reputation theory. Definition: Reputation is defined as a three-dimensional construct comprising the types of functional, social, and expressive reputation [36]. Articles: [36]. Theme: politics.

Rhetorical theory. Definition: Rhetorical theory is "a way of framing an experience or event—an effort to understand and account for something and the way it functions in the world" [64]. Articles: [38]. Theme: politics.

Third-person effect. Definition: The "third-person effect describes an individual's belief that other people (i.e., the third person), not oneself, are more susceptible to the negative persuasion of the media. The individual is consequently motivated to react out of concern for others" [28]. Articles: [28]. Theme: disaster.

Situational crisis communication theory (SCCT). Definition: SCCT comprises three elements: "(1) the crisis situation, (2) crisis response strategies, and (3) a system for matching the crisis situation and crisis response strategies. The theory states that effectiveness of communication strategies is dependent on characteristics of the crisis situation." [65]. Articles: [44]. Theme: disaster.
Once flagging or tagging of information is introduced, information without tags will be considered true or reliable. This creates an implied truth effect. Further, it is not always practical to evaluate all social media posts. Similarly, Kim and Dennis [36] studied fake news flagging and found that fake news flags did not influence users' beliefs; however, the flags created cognitive dissonance, and users went in search of the truthfulness of the headline. In 2017, Facebook discontinued its fake news flagging service owing to these limitations [45].

6 Key research gaps and future directions

Although misinformation is a multi-sectoral issue, our systematic review observed that interdisciplinary research on social media misinformation is relatively scarce. Confirmation bias is one of the most significant behavioral problems that motivates the spread of misinformation; however, the lack of research on it reveals the scope for future interdisciplinary research across the fields of Data Science, Information Systems, and Psychology in domains such as politics and healthcare. In the disaster context, there is scope for studying the behavior of first responders and emergency managers to understand their information exchange patterns with the public. Similarly, future researchers could analyze communication patterns between citizens and frontline workers in the public health context, which may be useful for designing counter-misinformation campaigns and awareness interventions. Since information disorder is a multi-sectoral issue, researchers need to understand misinformation patterns among multiple government departments for coordinated counter-misinformation interventions.

There is a further dearth of studies on institutional responses to control misinformation. To fill the gap, future studies could concentrate on the analysis of governmental and organizational interventions to control misinformation at the level of policies, regulatory mechanisms, and communication strategies. For example, in India there is no specific law against misinformation, but there are some provisions in the Information Technology Act (IT Act) and the Disaster Management Act which can control misinformation and disinformation. An example of an awareness intervention is an initiative named 'Satyameva Jayate' launched in the Kannur district of Kerala, India, which focused on sensitizing children at school to spot misinformation [67]. As noted earlier, within the research on misinformation in the political context, there is a lack of research on strategies adopted by the state to counter misinformation. Therefore, building on cases like 'Satyameva Jayate' would further contribute to knowledge in this area.

Technology-based strategies adopted by social media platforms to control the spread of misinformation emphasize corrective algorithms, keywords, and hashtags as a solution [32, 37, 43]. However, these corrective measures have their own limitations. Misinformation-corrective algorithms are ineffective if not used immediately after the misinformation has been created. Related hashtags and keywords are used by researchers to find and retrieve content shared on social media platforms; however, it may not be possible for researchers to cover all the keywords or hashtags employed by users, and algorithms may not decipher content shared in regional languages (see the sketch below). Another limitation of the algorithms employed by platforms is that they recommend and often display content based on user activities and interests, which limits users' access to information from multiple perspectives, thus reinforcing their existing beliefs [29]. A reparative measure is to display corrective information as 'related stories' for misinformation. However, Facebook's related stories algorithm only activates when an individual clicks on an outside link, which limits the number of people who will see the corrective information through the algorithm, and this turns out to be a challenge. Future research could investigate the impact of related stories as a corrective measure by analyzing the relation between misinformation and the frequency of related stories posted vis-à-vis real information.
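To illustrate the coverage problem named above, a minimal sketch of keyword/hashtag retrieval follows; posts using unlisted spellings, unlisted hashtags, or a regional language slip through the filter. All posts and seed terms are invented.

```python
# Minimal keyword/hashtag retrieval sketch; seed terms and posts are invented.
KEYWORDS = {"vaccine", "covid"}  # researcher-curated seed terms, lowercase

posts = [
    "New #vaccine drive starts today",   # retrieved
    "Vaxx gives you microchips?!",       # missed: unlisted spelling
    "#PlandemicTruth hiding the cure",   # missed: unlisted hashtag
    "വാക്സിൻ വ്യാജ വാർത്തയാണ്",           # missed: regional language
]

def matches(post: str) -> bool:
    # Normalize tokens by lowercasing and stripping hashtag/punctuation marks.
    return any(tok.lower().strip("#.,?!") in KEYWORDS for tok in post.split())

retrieved = [p for p in posts if matches(p)]
print(f"retrieved {len(retrieved)} of {len(posts)} relevant posts")  # 1 of 4
```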
Our review also found a scarcity of research on the spread of misinformation on certain social media platforms, while studies are skewed toward a few others. Of the studies reviewed, 15 articles concentrated on misinformation spread on Twitter and Facebook. Although recent news reports make it evident that misinformation and disinformation largely spread through popular messaging platforms like WhatsApp, Telegram, WeChat, and Line, research using data from these platforms is scanty. Especially in the Indian context, the magnitude of the problems arising from misinformation through WhatsApp is overwhelming [68]. To address this lacuna, we suggest that future researchers concentrate on investigating the patterns of misinformation spreading on platforms like WhatsApp. Moreover, message diffusion patterns are unique to each social media platform; therefore, it is useful to study misinformation diffusion patterns on different social media platforms. Future studies could also address the differential roles, patterns, and intensity of the spread of misinformation on various messaging and photo/video-sharing social networking services.

Evident from our review, most research on misinformation is based on the Euro-American context, and the dominant models proposed for controlling misinformation may have limited applicability to other regions. Moreover, the popularity of social media platforms and usage patterns are diverse across the globe, consequent to cultural differences and the political regimes of regions, therefore necessitating that researchers of social media take cognizance of the empirical experiences of 'left-over' regions.
7 Conclusion

Appendix
References

1. Thai, M.T., Wu, W., Xiong, H.: Big Data in Complex and Social Networks, 1st edn. CRC Press, Boca Raton (2017)
2. Peters, B.: How Social Media is Changing Law and Policy. Fair Observer (2020)
3. Granillo, G.: The Role of Social Media in Social Movements. Portland Monthly (2020)
4. Wu, L., Morstatter, F., Carley, K.M., Liu, H.: Misinformation in social media: definition, manipulation, and detection. ACM SIGKDD Explor. 21(1), 80–90 (2019)
5. Sam, L.: Mark Zuckerberg: I regret ridiculing fears over Facebook's effect on election. The Guardian (2017)
6. WEF: Global Risks 2013—World Economic Forum (2013)
7. Scroll: Communalisation of Tablighi Jamaat Event. Scroll.in (2020)
8. Cao, L.: Data science: a comprehensive overview. ACM Comput. Surv. 50(43), 1–42 (2017)
9. Oh, O., Agrawal, M., Rao, H.R.: Community intelligence and social media services: a rumor theoretic analysis of tweets during social crises. MIS Q. 37(2), 407–426 (2013)
10. Mukherjee, A., Liu, B., Glance, N.: Spotting fake reviewer groups in consumer reviews. In: WWW '12: Proceedings of the 21st Annual Conference on World Wide Web, pp. 191–200 (2012)
11. Cerf, V.G.: Information and misinformation on the internet. Commun. ACM 60(1), 9 (2016)
12. Tiwana, A., Konsynski, B., Bush, A.A.: Platform evolution: coevolution of platform architecture, governance, and environmental dynamics. Inf. Syst. Res. 21(4), 675–687 (2010)
13. Liu, F., Burton-Jones, A., Xu, D.: Rumors on social media in disasters: extending transmission to retransmission. In: PACIS 2014 Proceedings (2014)
14. Hern, A.: WhatsApp to impose new limit on forwarding to fight fake news. The Guardian (2020)
15. Oh, O., Gupta, P., Agrawal, M., Raghav Rao, H.: ICT mediated rumor beliefs and resulting user actions during a community crisis. Gov. Inf. Q. 35(2), 243–258 (2018)
16. Xiaoyi, A.T.C.Y.: Examining online vaccination discussion and communities in Twitter. In: SMSociety '18: Proceedings of the 9th International Conference on Social Media and Society (2018)
17. Lazer, D.M.J., et al.: The science of fake news. Science (2018)
18. Peter, S.: More Americans are getting their news from social media. Forbes.com (2019)
19. Jang, S.M., et al.: A computational approach for examining the roots and spreading patterns of fake news: evolution tree analysis. Comput. Hum. Behav. 84, 103–113 (2018)
20. Mele, N., et al.: Combating Fake News: An Agenda for Research and Action (2017)
21. Bernhard, U., Dohle, M.: Corrective or confirmative actions? Political online participation as a consequence of presumed media influences in election campaigns. J. Inf. Technol. Polit. 12(3), 285–302 (2015)
22. Webster, J., Watson, R.T.: Analyzing the past to prepare for the future: writing a literature review. MIS Q. 26(2) (2002)
23. aisnet.org: Senior Scholars' Basket of Journals. https://aisnet.org/page/SeniorScholarBasket. Accessed 16 Sept 2021
24. Torres, R.R., Gerhart, N., Negahban, A.: Epistemology in the era of fake news: an exploration of information verification behaviors among social networking site users. ACM 49, 78–97 (2018)
25. Abdullah, N.A., Nishioka, D., Tanaka, Y., Murayama, Y.: Why I retweet? Exploring user's perspective on decision-making of information spreading during disasters. In: Proceedings of the 50th Hawaii International Conference on System Sciences (2017)
26. Lee, J., Agrawal, M., Rao, H.R.: Message diffusion through social network service: the case of rumor and non-rumor related tweets during Boston bombing 2013. Inf. Syst. Front. 17(5), 997–1005 (2015)
27. Mondal, T., Pramanik, P., Bhattacharya, I., Boral, N., Ghosh, S.: Analysis and early detection of rumors in a post disaster scenario. Inf. Syst. Front. 20(5), 961–979 (2018)
28. Chua, A.Y.K., Cheah, S.-M., Goh, D.H., Lim, E.-P.: Collective rumor correction on the death hoax. In: PACIS 2016 Proceedings (2016)
29. Bode, L., Vraga, E.K.: In related news, that was wrong: the correction of misinformation through related stories functionality in social media. J. Commun. 65(4), 619–638 (2015)
30. Simon, T., Goldberg, A., Leykin, D., Adini, B.: Kidnapping WhatsApp—rumors during the search and rescue operation of three kidnapped youth. Comput. Hum. Behav. 64, 183–190 (2016)
31. Ghenai, A., Mejova, Y.: Fake cures: user-centric modeling of health misinformation in social media. In: Proceedings of the ACM on Human–Computer Interaction, vol. 2, no. CSCW, pp. 1–20 (2018)
32. Chua, A.Y.K., Banerjee, S.: To share or not to share: the role of epistemic belief in online health rumors. Int. J. Med. Inf. 108, 36–41 (2017)
33. Kou, Y., Gui, X., Chen, Y., Pine, K.H.: Conspiracy talk on social media: collective sensemaking during a public health crisis. In: Proceedings of the ACM on Human–Computer Interaction, vol. 1, no. CSCW, pp. 1–21 (2017)
34. Gu, R., Hong, Y.K.: Addressing health misinformation dissemination on mobile social media. In: ICIS 2019 Proceedings (2019)
35. Bode, L., Vraga, E.K.: See something, say something: correction of global health misinformation on social media. Health Commun. 33(9), 1131–1140 (2018)
36. Kim, A., Moravec, P.L., Dennis, A.R.: Combating fake news on social media with source ratings: the effects of user and expert reputation ratings. J. Manag. Inf. Syst. 36(3), 931–968 (2019)
37. Chua, A.Y.K., Banerjee, S.: Intentions to trust and share online health rumors: an experiment with medical professionals. Comput. Hum. Behav. 87, 1–9 (2018)
38. Murungi, D., Purao, S., Yates, D.: Beyond facts: a new spin on fake news in the age of social media. In: AMCIS 2018 Proceedings (2018)
39. Pennycook, G., Bear, A., Collins, E.T., Rand, D.G.: The implied truth effect: attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Manag. Sci. (2020). https://doi.org/10.1287/mnsc.2019.3478
40. Garrett, R., Poulsen, S.: Flagging Facebook falsehoods: self-identified humor warnings outperform fact checker and peer warnings. J. Comput.-Mediat. Commun. (2019). https://doi.org/10.1093/jcmc/zmz012
41. Shin, J., Thorson, K.: Partisan selective sharing: the biased diffusion of fact-checking messages on social media. J. Commun. 67(2), 233–255 (2017)
42. Kim, A., Dennis, A.R.: Says who? The effects of presentation format and source rating on fake news in social media. MIS Q. (2019). https://doi.org/10.25300/MISQ/2019/15188
43. Hazel Kwon, K., Raghav Rao, H.: Cyber-rumor sharing under a homeland security threat in the context of government Internet surveillance: the case of South–North Korea conflict. Gov. Inf. Q. 34(2), 307–316 (2017)
44. Paek, H.J., Hove, T.: Effective strategies for responding to rumors about risks: the case of radiation-contaminated food in South Korea. Public Relat. Rev. 45(3), 101762 (2019)
45. Moravec, P.L., Minas, R.K., Dennis, A.R.: Fake news on social media: people believe what they want to believe when it makes no sense at all. MIS Q. (2019). https://doi.org/10.25300/MISQ/2019/15505
46. Madraki, et al.: Characterizing and comparing COVID-19 misinformation across languages, countries and platforms. In: WWW '21: Companion Proceedings of the Web Conference (2021)
47. Shahi, G.K., Dirkson, A., Majchrzak, T.A.: An exploratory study of COVID-19 misinformation on Twitter. Online Soc. Netw. Media 22, 100104 (2021)
48. Otala, M., et al.: Political polarization and platform migration: a study of Parler and Twitter usage by United States of America Congress Members. In: WWW '21: Companion Proceedings of the Web Conference (2021)
49. Paul, L.J.: Encyclopedia of Survey Research Methods. Sage Research Methods, Thousand Oaks (2008)
50. WHO: Munich Security Conference. https://www.who.int/director-general/speeches/detail/munich-security-conference. Accessed 24 Sept 2021
51. Coleman, A.: 'Hundreds dead' because of Covid-19 misinformation. BBC News (2020)
52. Benenson, E.: Vaccine myths: facts vs. fiction. VCU Health (2021). https://www.vcuhealth.org/news/covid-19/vaccine-myths-facts-vs-fiction. Accessed 24 Sept 2021
53. Pierpoint, G.: Kerala floods: fake news 'creating unnecessary panic'. BBC News (2018)
54. Campbell, D.J.: Task complexity: a review and analysis. Acad. Manag. Rev. 13(1), 40 (1988)
55. Balan, M.U., Mathew, S.K.: Personalize, summarize or let them read? A study on online word of mouth strategies and consumer decision process. Inf. Syst. Front. 23, 1–21 (2020)
56. Shibutani, T.: Improvised News: A Sociological Study of Rumor. The Bobbs-Merrill Company, Indianapolis (1966)
57. Pezzo, M.V., Beckstead, J.W.: A multilevel analysis of rumor transmission: effects of anxiety and belief in two field experiments. Basic Appl. Soc. Psychol. (2006). https://doi.org/10.1207/s15324834basp2801_8
58. Li, Y.-J., Cheung, C.M.K., Shen, X.-L., Lee, M.K.O.: Health misinformation on social media: a literature review. In: Association for Information Systems (2019)
59. Wang, Y., McKee, M., Torbica, A., Stuckler, D.: Systematic literature review on the spread of health-related misinformation on social media. Soc. Sci. Med. 240, 112552 (2019)
60. Pappa, D., Stergioulas, L.K.: Harnessing social media data for pharmacovigilance: a review of current state of the art, challenges and future directions. Int. J. Data Sci. Anal. 8(2), 113–135 (2019)
61. BBC: Fighting Covid-19 fake news in Africa. BBC News (2020)
62. Chua, A.Y.K., Aricat, R., Goh, D.: Message content in the life of rumors: comparing three rumor types. In: 2017 12th International Conference on Digital Information Management (ICDIM), pp. 263–268 (2017)
63. Lee, A.R., Son, S.-M., Kim, K.K.: Information and communication technology overload and social networking service fatigue: a stress perspective. Comput. Hum. Behav. 55, 51–61 (2016)
64. Foss, K., Foss, S., Griffin, C.: Feminist Rhetorical Theories (1999). https://doi.org/10.1080/07491409.2000.10162571
65. Coombs, W., Holladay, S.J.: Reasoned action in crisis communication: an attribution theory-based approach to crisis management. In: Responding to Crisis: A Rhetorical Approach to Crisis Communication (2004)
66. Smith, H.J., Dinev, T., Xu, H.: Information privacy research: an interdisciplinary review. MIS Q. 35, 989–1015 (2011)
67. Ammu, C.: Kerala: Kannur district teaches school kids to spot fake news. The Week (2018)
68. Ponniah, K.: WhatsApp: the 'black hole' of fake news in India's election. BBC News (2019)

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.