Social Media
Such content is accessible to the public, often leading to incorrect judgments, inciting negative public sentiment, and posing major dangers to public safety and social order. Scholars are therefore concerned about the propagation of disinformation through social networks, which is both unsettling and inescapable today.
In these complex times, fake news is much more than a name for inaccurate and misleading material masquerading as news; it has evolved into an emotive, weaponized word used to attack and discredit journalism. Misinformation and disinformation are objective social phenomena that arise in everyday social life. These terms often refer to material that has been extensively circulated without adequate verification or explanation, and they have become a source of concern in sociology, journalism, and the social sciences more broadly.
With the advancement of internet technology and social media platforms, disinformation that once spread by word of mouth is now distributed swiftly online. It spreads in a fission-like pattern, proliferating rapidly, reaching a broad audience, and leaving a profound impact. The flood of false information, rumors, and misleading content on social media platforms not only causes public concern and threatens the public's physical and psychological health but also poses serious challenges to governance. These developments have made the term "information pandemic" familiar to the general public. The term refers
to a set of physical and psychological responses that the general population experiences when confronted with disinformation, since it is impossible to determine the truth of the information and the spread of misinformation infiltrates everyone's life. During the COVID-19 outbreak, for
example, the World Health Organization saw combating the "information pandemic" as a vital
part of its mission. With the rise of social media, the reach of the "information pandemic" grew, amplifying the harm posed by disinformation. For instance, when confronted with
disinformation, the uncertainty of the future, and a lack of access to knowledge, the public's
psychological pressure increases, generating anxiety and panic. In such moments, the public is highly susceptible to crowd dynamics: rumors and inaccurate information can magnify mass fear, spark collective social crises, and even lead to social catastrophes. Research has shown that the damage caused by misinformation transmitted on social media is especially severe owing to features such as quick transmission speed, a broad range of influence, and profound effects. Consequently, it is critical to understand the process by which disinformation spreads. A prominent example is the 2016 presidential election in the United States, which became an uproarious issue. In the weeks and months leading up to
November 8, social media platforms like Twitter and Facebook were inundated with "fake news"
(Howard, 2017). Following Donald Trump's election as the forty-fifth President of the United
States, investigations showed that considerable foreign influence had played a part throughout
the campaign, with efforts aimed primarily at influencing the outcome of the election. Most fingers pointed straight at Russia and President Vladimir Putin's administration as the most likely source.
It was far from the first time social media was used in influence operations. For example,
the Islamic State terrorist group (ISIS) employed large Twitter campaigns a few years ago to
promote propaganda, foster radicalization, and recruit foreign fighters for its war in Syria and Iraq.
Governments and non-state actors alike conducted influence operations long before social media, but the extent, intensity, and effect of modern influence operations are new, and all three are expected to grow as digital platforms expand their reach across the internet and become more fundamental to our social, economic, and political life.
Democracies, for their part, rely on open and unfettered information exchange and are especially vulnerable to the poison of influence operations that distribute false news. They rest on the premise of an educated population with a shared sense of facts, shared public narratives, and firm confidence in the information supplied by institutions. This whole assemblage is under attack from well-planned influence operations, and the situation will only worsen as new "deep fake" technologies mature.
With that being said, small but significant adjustments are both achievable and required.
In general, combating the issue entails both countering influence operations directly and strengthening people's online "immunity" so they are less susceptible to misleading, incorrect, and divisive information. Broad-based educational activities aimed at raising user
awareness of bogus information may be beneficial but are prohibitively expensive. Inoculating
critical points (people) inside a network is more effective and less expensive (Christakis &
Fowler, 2011). Targeted engagement with people at the center of networks (those with high network centrality scores, in social network analysis jargon) might help develop herd immunity and allow defenders to craft their own narratives of events that can be used to counteract the influence operations of
others. The efficacy of such counter-narratives is dependent on the confidence that users have in
their sources; thus, acting quickly to stem the flow of disruptive influence operations aimed at undermining that trust is essential. Platforms currently address such operations through two complementary techniques. First, platforms increasingly use their massive user bases to
encourage individuals to flag and report potentially offensive material. The platforms then
review the material that was highlighted. If it violates a platform's terms of service or community
rules, the content may be deleted and the account that posted it banned. Second, in addition to these crowdsourced reports, platforms deploy automated techniques to detect and remove harmful information. With additional data, these techniques will continue to improve.
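The flag-and-review workflow described above can be sketched as a simple pipeline: user reports accumulate against a piece of content, and once enough distinct users have flagged it, the item is queued for human review; a confirmed violation leads to removal and an account ban. This is a minimal illustration only; the class, method names, and the report threshold are assumptions for the sketch, not any real platform's API or policy.

```python
from collections import defaultdict

# Illustrative flag-and-review pipeline (all names and the threshold
# are assumptions, not a real platform's moderation system).
REPORT_THRESHOLD = 3  # distinct reports needed before human review

class ModerationQueue:
    def __init__(self):
        self.reports = defaultdict(set)   # content_id -> set of reporting users
        self.review_queue = []            # content awaiting human review
        self.removed = set()              # content taken down
        self.banned_accounts = set()      # accounts banned for violations

    def flag(self, content_id, reporter):
        """Record a user report; queue content once enough distinct users flag it."""
        if content_id in self.removed:
            return
        self.reports[content_id].add(reporter)
        if (len(self.reports[content_id]) >= REPORT_THRESHOLD
                and content_id not in self.review_queue):
            self.review_queue.append(content_id)

    def review(self, content_id, author, violates_rules):
        """Human decision: remove content and ban its author if it breaks the rules."""
        if content_id in self.review_queue:
            self.review_queue.remove(content_id)
        if violates_rules:
            self.removed.add(content_id)
            self.banned_accounts.add(author)

q = ModerationQueue()
for user in ("u1", "u2", "u3"):       # three distinct users flag the same post
    q.flag("post-42", user)
print(q.review_queue)                  # ['post-42']
q.review("post-42", author="acct-9", violates_rules=True)
print("post-42" in q.removed)          # True
```

Requiring several distinct reporters before review is one common way such systems dampen abuse of the reporting mechanism itself, since a single hostile user cannot force content into the review queue alone.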
REFERENCES:
Allcott, Hunt and Matthew Gentzkow. 2017. “Social Media and Fake News in the 2016 Election.” Journal of Economic Perspectives 31(2): 211–236. https://web.stanford.edu/~gentzkow/research/fakenews.pdf.
Christakis, Nicholas A. and James H. Fowler. 2011. Connected: The Surprising Power of Our
Social Networks and How They Shape Our Lives — How Your Friends’ Friends’ Friends
Affect Everything You Feel, Think, and Do. New York, NY: Back Bay Books.
https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0175799&type=printable.
Fisher, Marc, John Woodrow Cox and Peter Hermann. 2016. “Pizzagate: From rumor, to hashtag, to gunfire in D.C.” The Washington Post.
Frank, Russell. 2015. “Caveat Lector: Fake News as Folklore.” The Journal of American Folklore.
www.researchgate.net/publication/281601869_Caveat_Lector_Fake_News_as_Folklore.
Giles, Martin. 2019. “Five emerging cyber-threats to worry about in 2019.” MIT Technology
Review. www.technologyreview.com/s/612713/five-emerging-cyber-threats-2019/.
Guo, L. and Y. Zhang. 2020. “Information flow within and across online media platforms: an
Li, L., H. Xia, R. Zhang, and Y. Li. 2019. “DDSEIR: a dynamic rumor spreading model in
Pennycook, Gordon and David G. Rand. 2019. “Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning.” Cognition 188: 39–50.