Research article
Open access

Reconsidering Self-Moderation: the Role of Research in Supporting Community-Based Models for Online Content Moderation

Published: 15 October 2020

    Abstract

    Research in online content moderation has a long history of exploring different forms that moderation can take, including both user-driven moderation models on community-based platforms like Wikipedia, Facebook Groups, and Reddit, and centralized corporate moderation models on platforms like Twitter and Instagram. In this work I review different approaches to moderation research with the goal of providing a roadmap for researchers studying community self-moderation. I contrast community-based moderation research with platforms-and-policies-focused moderation research, and argue that the former has an important role to play in shaping discussions about the future of online moderation. I provide six guiding questions for future research that, if answered, can support the development of a form of user-driven moderation that is widely implementable across a variety of social spaces online, offering an alternative to the corporate moderation models that dominate public debate and discussion.

    Published In

    Proceedings of the ACM on Human-Computer Interaction, Volume 4, Issue CSCW2 (CSCW)
    October 2020, 2310 pages
    EISSN: 2573-0142
    DOI: 10.1145/3430143
    This work is licensed under a Creative Commons Attribution 4.0 International License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 15 October 2020
    Published in PACMHCI Volume 4, Issue CSCW2

    Author Tags

    1. governance
    2. harassment
    3. hate speech
    4. moderation
    5. online communities
    6. platforms
    7. social networks

    Qualifiers

    • Research-article

    Cited By

    • (2024) Challenges in moderating disruptive player behavior in online competitive action games. Frontiers in Computer Science, 6. https://doi.org/10.3389/fcomp.2024.1283735. Online publication date: 23-Feb-2024.
    • (2024) Copyright callouts and the promise of creator-driven platform governance. Internet Policy Review, 13(2). https://doi.org/10.14763/2024.2.1770. Online publication date: 26-Jun-2024.
    • (2024) Safety and Community Context: Exploring a Transfeminist Approach to Sapphic Relationship Platforms. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-34. https://doi.org/10.1145/3653694. Online publication date: 26-Apr-2024.
    • (2024) What Does a Downvote Do? Performing Complementary and Competing Knowledge Practices on an Online Platform. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-28. https://doi.org/10.1145/3653692. Online publication date: 26-Apr-2024.
    • (2024) Combating Islamophobia: Compromise, Community, and Harmony in Mitigating Harmful Online Content. ACM Transactions on Social Computing, 7(1), 1-32. https://doi.org/10.1145/3641510. Online publication date: 23-Feb-2024.
    • (2024) ReSPect: Enabling Active and Scalable Responses to Networked Online Harassment. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-30. https://doi.org/10.1145/3637394. Online publication date: 26-Apr-2024.
    • (2024) Investigating Influential Users' Responses to Permanent Suspension on Social Media. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-41. https://doi.org/10.1145/3637356. Online publication date: 26-Apr-2024.
    • (2024) Governance Capture in a Self-Governing Community: A Qualitative Comparison of the Croatian, Serbian, Bosnian, and Serbo-Croatian Wikipedias. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-26. https://doi.org/10.1145/3637338. Online publication date: 26-Apr-2024.
    • (2024) Beyond Initial Removal: Lasting Impacts of Discriminatory Content Moderation to Marginalized Creators on Instagram. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-28. https://doi.org/10.1145/3637300. Online publication date: 26-Apr-2024.
    • (2024) Trans-centered moderation: Trans technology creators and centering transness in platform and community governance. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, 326-336. https://doi.org/10.1145/3630106.3658909. Online publication date: 3-Jun-2024.