Classification and Its Consequences for Online Harassment: Design Insights from HeartMob

Published: 06 December 2017

Abstract

    Online harassment is a pervasive and pernicious problem. Techniques like natural language processing and machine learning are promising approaches for identifying abusive language, but they fail to address structural power imbalances perpetuated by automated labeling and classification. Similarly, platform policies and reporting tools are designed for a seemingly homogenous user base and do not account for individual experiences and systems of social oppression. This paper describes the design and evaluation of HeartMob, a platform built by and for people who are disproportionately affected by the most severe forms of online harassment. We conducted interviews with 18 HeartMob users, both targets and supporters, about their harassment experiences and their use of the site. We examine systems of classification enacted by technical systems, platform policies, and users to demonstrate how 1) labeling serves to validate (or invalidate) harassment experiences; 2) labeling motivates bystanders to provide support; and 3) labeling content as harassment is critical for surfacing community norms around appropriate user behavior. We discuss these results through the lens of Bowker and Star's classification theories and describe implications for labeling and classifying online abuse. Finally, informed by intersectional feminist theory, we argue that fully addressing online harassment requires the ongoing integration of vulnerable users' needs into the design and moderation of online platforms.

    References

    [1]
    Sara Ahmed. 2017. Living a Feminist Life. Duke University Press.
    [2]
    Julia Angwin. 2017. Facebook's Secret Censorship Rules Protect White Men from Hate Speech But Not Black Children. ProPublica.
    [3]
    Zahra Ashktorab and Jessica Vitak. 2016. Designing Cyberbullying Mitigation and Prevention Solutions Through Participatory Design With Teenagers. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 3895--3905.
    [4]
    Shaowen Bardzell and Jeffrey Bardzell. 2011. Towards a feminist HCI methodology: social science, feminism, and HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 675--684.
    [5]
    Howard Becker. 1963. Outsiders. The Free Press, Glencoe.
    [6]
    Lindsay Blackwell, Jean Hardy, Tawfiq Ammari, Tiffany Veinot, Cliff Lampe, and Sarita Schoenebeck. 2016. LGBT Parents and Social Media: Advocacy, Privacy, and Disclosure During Shifting Social Movements. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 610--622.
    [7]
    Geoffrey C. Bowker and Susan Leigh Star. 2000. Sorting things out: Classification and its consequences. MIT press.
    [8]
    Amy Bruckman, Catalina Danis, Cliff Lampe, Janet Sternberg, and Chris Waldron. 2006. Managing deviant behavior in online communities. In CHI '06 Extended Abstracts on Human Factors in Computing Systems (CHI EA '06). ACM, New York, NY, USA, 21--24.
    [9]
    Eshwar Chandrasekharan, Mattia Samory, Anirudh Srinivasan, and Eric Gilbert. 2017. The Bag of Communities: Identifying Abusive Behavior Online with Preexisting Internet Data. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 3175--3187.
    [10]
    Justin Cheng, Michael Bernstein, Cristian Danescu-Niculescu-Mizil, and Jure Leskovec. 2017. Anyone Can Become a Troll: Causes of Trolling Behavior in Online Discussions. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '17). ACM, New York, NY, USA, 1217--1230.
    [11]
    Robert B. Cialdini. 2007. Descriptive social norms as underappreciated sources of social control. Psychometrika 72, 2: 263.
    [12]
    Danielle Keats Citron and Mary Anne Franks. 2014. Criminalizing Revenge Porn. Social Science Research Network, Rochester, NY.
    [13]
    Rep Katherine Clark. 2015. Online Violence Against Trans Women Perpetuates Dangerous Cycle. Huffington Post.
    [14]
    Keith Collins. 2017. Tech is overwhelmingly white and male, and white men are just fine with that. Quartz.
    [15]
    Patricia Hill Collins. 1993. Black feminist thought in the matrix of domination. Social theory: The multicultural and classic readings: 615--625.
    [16]
    Kimberle Crenshaw. 1991. Mapping the margins: Intersectionality, identity politics, and violence against women of color. Stanford law review: 1241--1299.
    [17]
    Julian Dibbell. 1993. A Rape in Cyberspace. Village Voice XXXVIII, 51.
    [18]
    Jill P. Dimond, Michaelanne Dye, Daphne Larose, and Amy S. Bruckman. 2013. Hollaback!: the role of storytelling online in a social movement organization. In Proceedings of the 2013 conference on Computer supported cooperative work (CSCW '13). ACM, New York, NY, USA, 477--490.
    [19]
    Judith Donath. 1999. Identity and Deception in the Virtual Community. Communities in cyberspace. Psychology Press.
    [20]
    Maeve Duggan. 2014. Online Harassment. Pew Research Center.
    [21]
    Maeve Duggan. 2017. Online Harassment 2017. Pew Research Center.
    [22]
    Noah J. Goldstein, Robert B. Cialdini, and Vladas Griskevicius. 2008. A room with a viewpoint: Using social norms to motivate environmental conservation in hotels. Journal of consumer Research 35, 3: 472--482.
    [23]
    Mark Griffiths. 2002. Occupational health issues concerning Internet use in the workplace. Work & Stress 16, 4: 283--286.
    [24]
    Sandra Harding. 1992. Rethinking standpoint epistemology: What is "strong objectivity"? The Centennial Review 36, 3: 437--470.
    [25]
    bell hooks. 2000. Feminism is for everybody: Passionate politics. Pluto Press.
    [26]
    Hossein Hosseini, Sreeram Kannan, Baosen Zhang, and Radha Poovendran. 2017. Deceiving Google's Perspective API Built for Detecting Toxic Comments. arXiv:1702.08138 [cs].
    [27]
    Sara Kiesler, Robert E. Kraut, Paul Resnick, and Aniket Kittur. 2012. Regulating behavior in online communities. Building Successful Online Communities: Evidence-Based Social Design. MIT Press, Cambridge, MA.
    [28]
    Cliff Lampe. 2014. Gamification and social media. The Gameful world: Approaches, issues, applications. MIT Press, Cambridge, MA.
    [29]
    Cliff Lampe and Paul Resnick. 2004. Slash (dot) and burn: distributed moderation in a large online conversation space. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '04). ACM, New York, NY, USA, 543--550.
    [30]
    Amanda Lenhart, Michele Ybarra, Kathryn Zickuhr, and Myeshia Price-Feeney. 2016. Online Harassment, Digital Abuse, and Cyberstalking in America. Data & Society Institute.
    [31]
    Eden Litt and Eszter Hargittai. 2016. The imagined audience on social network sites. Social Media + Society 2, 1: 2056305116633482.
    [32]
    Alice E. Marwick. 2012. The public domain: Social surveillance in everyday life. Surveillance & Society 9, 4: 378.
    [33]
    Karl Marx and Friedrich Engels. 1967. The communist manifesto (1848). Trans. Samuel Moore. London: Penguin.
    [34]
    Leysia Palen and Paul Dourish. 2003. Unpacking privacy for a networked world. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '03). ACM, New York, NY, USA, 129--136.
    [35]
    Michael L. Pittaro. 2007. Cyber stalking: An analysis of online harassment and intimidation. International Journal of Cyber Criminology 1, 2: 180--197.
    [36]
    Lee Rainie, Janna Anderson, and Jonathan Albright. 2017. The Future of Free Speech, Trolls, Anonymity and Fake News Online. Pew Research Center.
    [37]
    Sarita Schoenebeck, Nicole B. Ellison, Lindsay Blackwell, Joseph B. Bayer, and Emily B. Falk. 2016. Playful Backstalking and Serious Impression Management: How Young Adults Reflect on Their Past Identities on Facebook. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (CSCW '16). ACM, New York, NY, USA, 1475--1487.
    [38]
    Michael D. Shear. 2016. Trump as Cyberbully in Chief? Twitter Attack on Union Boss Draws Fire. The New York Times.
    [39]
    Peter K. Smith, Jess Mahdavi, Manuel Carvalho, Sonja Fisher, Shanette Russell, and Neil Tippett. 2008. Cyberbullying: its nature and impact in secondary school pupils. Journal of Child Psychology and Psychiatry 49, 4: 376--385.
    [40]
    Lee Sproull and Sara Kiesler. 1992. Connections: New Ways of Working in the Networked Organization. MIT Press.
    [41]
    Susan Leigh Star and James R. Griesemer. 1989. Institutional ecology, 'translations' and boundary objects: Amateurs and professionals in Berkeley's Museum of Vertebrate Zoology, 1907--39. Social studies of science 19, 3: 387--420.
    [42]
    David R. Thomas. 2006. A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation 27, 2: 237--246.
    [43]
    U. S. Department of Labor. 2015. Current Population Survey: Detailed occupation by sex and race. Bureau of Labor Statistics.
    [44]
    Jessica Vitak, Stacy Blasiola, Eden Litt, and Sameer Patil. 2015. Balancing Audience and Privacy Tensions on Social Network Sites: Strategies of Highly Engaged Users. International Journal of Communication 9: 20.
    [45]
    Joseph B. Walther and Malcolm R. Parks. 2002. Cues filtered out, cues filtered in. Handbook of interpersonal communication 3: 529--563.
    [46]
    Christina Warren. 2017. Twitter's New Abuse Filter Works Great, If Your Name Is Mike Pence. Gizmodo.
    [47]
    Jeffrey Weeks. 1999. Discourse, desire and sexual deviance: some problems in a history of homosexuality. Culture, society and sexuality. A reader: 119--42.
    [48]
    Amanda M. Williams and Lilly Irani. 2010. There's methodology in the madness: toward critical HCI ethnography. In CHI '10 Extended Abstracts on Human Factors in Computing Systems (CHI EA '10). ACM, New York, NY, USA, 2725--2734.
    [49]
    Rhiannon Williams. 2014. Facebook's 71 gender options come to UK users. The Telegraph.
    [50]
    Ellery Wulczyn, Nithum Thain, and Lucas Dixon. 2017. Ex Machina: Personal Attacks Seen at Scale. In Proceedings of the 26th International Conference on World Wide Web (WWW '17). International World Wide Web Conferences Steering Committee, Republic and Canton of Geneva, Switzerland, 1391--1399.
    [51]
    Dawei Yin, Zhenzhen Xue, Liangjie Hong, Brian D. Davison, April Kontostathis, and Lynne Edwards. 2009. Detection of harassment on web 2.0. Proceedings of the Content Analysis in the WEB 2: 1--7.

    Published In

    Proceedings of the ACM on Human-Computer Interaction, Volume 1, Issue CSCW
    November 2017
    2095 pages
    EISSN: 2573-0142
    DOI: 10.1145/3171581
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 06 December 2017
    Published in PACMHCI Volume 1, Issue CSCW

    Author Tags

    1. bystanders
    2. classification
    3. intersectionality
    4. labeling
    5. moderation
    6. online harassment
    7. social norms
    8. support

    Qualifiers

    • Research-article

    Funding Sources

    • Knight Foundation

    Cited By

    • (2024) Forestalling Cyber Bullying and Online Harassment. Wearable Devices, Surveillance Systems, and AI for Women's Wellbeing, 10.4018/979-8-3693-3406-5.ch010, 148-181. Online publication date: 8-Mar-2024.
    • (2024) Canadian Gender-Based Violence Prevention Programs: Gaps and Opportunities. Violence Against Women, 10.1177/10778012241259727. Online publication date: 11-Jun-2024.
    • (2024) You change the way you talk: Examining the network, toxicity and discourse of cross-platform users on Twitter and Parler during the 2020 US Presidential Election. Journal of Information Science, 10.1177/01655515241238405. Online publication date: 28-Apr-2024.
    • (2024) Opportunities, tensions, and challenges in computational approaches to addressing online harassment. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 10.1145/3643834.3661623, 1483-1498. Online publication date: 1-Jul-2024.
    • (2024) Labeling in the Dark: Exploring Content Creators' and Consumers' Experiences with Content Classification for Child Safety on YouTube. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 10.1145/3643834.3661565, 1518-1532. Online publication date: 1-Jul-2024.
    • (2024) Stoking the Flames: Understanding Escalation in an Online Harassment Community. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 10.1145/3641015, 1-23. Online publication date: 26-Apr-2024.
    • (2024) Fighting for Their Voice: Understanding Indian Muslim Women's Responses to Networked Harassment. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 10.1145/3641005, 1-24. Online publication date: 26-Apr-2024.
    • (2024) ReSPect: Enabling Active and Scalable Responses to Networked Online Harassment. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 10.1145/3637394, 1-30. Online publication date: 26-Apr-2024.
    • (2024) AppealMod: Inducing Friction to Reduce Moderator Workload of Handling User Appeals. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 10.1145/3637296, 1-35. Online publication date: 26-Apr-2024.
    • (2024) A Critical Analysis of the Prevalence of Technology-Facilitated Abuse in US College Students. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 10.1145/3613905.3652036, 1-12. Online publication date: 11-May-2024.
