
Fakey: A Game Intervention to Improve News Literacy on Social Media

Published: 22 April 2021

Abstract

We designed and developed Fakey, a game that emulates a social media feed to improve news literacy and reduce the spread of misinformation. We analyzed player interactions with articles in the feed, collected over 19 months of a real-world deployment of the game. We found that Fakey is effective in priming players to be suspicious of articles from questionable sources. Players who interact with more articles in the game enhance their skills in spotting mainstream content, confirming the utility of Fakey for improving news literacy. Semi-structured interviews with players revealed that they find the game simple, fun, and educational. The principles and mechanisms used by Fakey can inform the design of social media functionality that helps people distinguish between credible and questionable content in their news feeds.
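
The abstract describes the game mechanic only at a high level: Fakey shows players a feed that mixes articles from mainstream and questionable sources, and players improve by reacting to each item appropriately. As a rough, non-authoritative sketch of that idea in Python (the Article fields, the share/like/fact_check/skip actions, and the point values are illustrative assumptions, not the published game's exact design), one feed round might be modeled as follows:

    from dataclasses import dataclass
    from random import sample, shuffle

    @dataclass
    class Article:
        headline: str
        source: str
        credible: bool  # assumption: True for a mainstream source, False for a questionable one

    def build_feed(mainstream, questionable, k=4):
        """Mix k articles from each pool into one news-feed-style round."""
        feed = sample(mainstream, k) + sample(questionable, k)
        shuffle(feed)
        return feed

    def score(article, action):
        """Reward endorsing credible content and fact-checking questionable content."""
        if action in ("share", "like"):
            return 1 if article.credible else -1
        if action == "fact_check":
            return 1 if not article.credible else -1
        return 0  # "skip" is neutral

    # Example: correctly fact-checking a questionable article earns a point.
    # score(Article("Miracle cure!", "dubious.example", credible=False), "fact_check")  -> 1

Under this kind of scoring, a player earns points only by treating mainstream and questionable sources differently, which is the news-literacy skill the game is meant to exercise.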

Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue CSCW1 (CSCW)
April 2021, 5016 pages
EISSN: 2573-0142
DOI: 10.1145/3460939
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 22 April 2021
Published in PACMHCI Volume 5, Issue CSCW1

Author Tags

  1. fake news
  2. game
  3. low-credibility content
  4. misinformation
  5. news feed
  6. news literacy
  7. social media

Qualifiers

  • Research-article

Article Metrics

  • Downloads (Last 12 months): 1,035
  • Downloads (Last 6 weeks): 107
Reflects downloads up to 16 Oct 2024

Cited By

View all
  • (2024)Effectiveness of WhatsApp based debunking reminders on follow-up visit attendance for individuals with hypertension: a randomized controlled trial in IndiaBMC Public Health10.1186/s12889-024-19894-924:1Online publication date: 9-Sep-2024
  • (2024)Promoting End-User Security and Privacy Through Serious Game DevelopmentCompanion Proceedings of the 2024 Annual Symposium on Computer-Human Interaction in Play10.1145/3665463.3678848(409-412)Online publication date: 14-Oct-2024
  • (2024)Tell Me What You Like and I Know What You Will Share: Topical Interest Influences Behavior Toward News From High and Low Credible Sources2024 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW)10.1109/EuroSPW61312.2024.00062(504-518)Online publication date: 8-Jul-2024
  • (2024)Co-designing and pilot testing a digital game to improve vaccine attitudes and misinformation resistance in GhanaHuman Vaccines & Immunotherapeutics10.1080/21645515.2024.240720420:1Online publication date: Oct-2024
  • (2024) Bad News in the civics classroom: How serious gameplay fosters teenagers’ ability to discern misinformation techniques Journal of Research on Technology in Education10.1080/15391523.2024.2338451(1-27)Online publication date: 19-Apr-2024
  • (2024)Designing an Effective Fact-checking Education Program: the Complementary Relationship between Games and Lectures in Teaching Media LiteracyComputers & Education10.1016/j.compedu.2024.105136(105136)Online publication date: Aug-2024
  • (2024)Human–computer interaction tools with gameful design for critical thinking the media ecosystem: a classification frameworkAI & Society10.1007/s00146-022-01583-z39:3(1317-1329)Online publication date: 1-Jun-2024
  • (2023)The (Mis)Information Game: A social media simulatorBehavior Research Methods10.3758/s13428-023-02153-x56:3(2376-2397)Online publication date: 11-Jul-2023
  • (2023)COVID-19 News Exposure and Vaccinations: A Moderated Mediation of Digital News Literacy Behavior and Vaccine MisperceptionsInternational Journal of Environmental Research and Public Health10.3390/ijerph2001089120:1(891)Online publication date: 3-Jan-2023
  • (2023)Reviewing Interventions to Address Misinformation: The Need to Expand Our Vision Beyond an Individualistic FocusProceedings of the ACM on Human-Computer Interaction10.1145/35795207:CSCW1(1-34)Online publication date: 16-Apr-2023
  • Show More Cited By
