Public Access

"How advertiser-friendly is my video?": YouTuber's Socioeconomic Interactions with Algorithmic Content Moderation

Published: 18 October 2021

Abstract

    To manage user-generated harmful video content, YouTube relies on AI algorithms (e.g., machine learning) in content moderation and follows a retributive justice logic to punish convicted YouTubers through demonetization, a penalty that limits or deprives them of advertisements (ads), reducing their future ad income. Moderation research is burgeoning in CSCW, but relatively little attention has been paid to the socioeconomic implications of YouTube's algorithmic moderation. Drawing from the lens of algorithmic labor, we describe how algorithmic moderation shapes YouTubers' labor conditions through algorithmic opacity and precarity. YouTubers coped with such challenges from algorithmic moderation by sharing and applying practical knowledge they learned about moderation algorithms. By analyzing video content creation as algorithmic labor, we unpack the socioeconomic implications of algorithmic moderation and point to necessary post-punishment support as a form of restorative justice. Lastly, we put forward design considerations for algorithmic moderation systems.




Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue CSCW2 (October 2021), 5376 pages.
EISSN: 2573-0142
DOI: 10.1145/3493286

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. YouTube moderation
      2. YouTuber
      3. algorithmic labor
      4. algorithmic moderation
      5. content moderation
      6. socioeconomics

      Qualifiers

      • Research-article


      Article Metrics

• Downloads (last 12 months): 772
• Downloads (last 6 weeks): 56
Reflects downloads up to 11 Aug 2024

Cited By
• (2024) Labeling in the Dark: Exploring Content Creators' and Consumers' Experiences with Content Classification for Child Safety on YouTube. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1518-1532. DOI: 10.1145/3643834.3661565. Online publication date: 1-Jul-2024.
• (2024) Creative Precarity in Motion: Revealing the Hidden Labor Behind Animating Virtual Characters. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3471-3484. DOI: 10.1145/3643834.3661545. Online publication date: 1-Jul-2024.
• (2024) The Ecology of Harmful Design: Risk and Safety of Game Making on a Metaverse Platform. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1842-1856. DOI: 10.1145/3643834.3660678. Online publication date: 1-Jul-2024.
• (2024) Understanding Phantom Tactile Sensation on Commercially Available Social Virtual Reality Platforms. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1-22. DOI: 10.1145/3637418. Online publication date: 26-Apr-2024.
• (2024) Beyond Initial Removal: Lasting Impacts of Discriminatory Content Moderation to Marginalized Creators on Instagram. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1-28. DOI: 10.1145/3637300. Online publication date: 26-Apr-2024.
• (2024) TS2ACT. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 4, 1-22. DOI: 10.1145/3631445. Online publication date: 12-Jan-2024.
• (2024) Third-Party Developers and Tool Development For Community Management on Live Streaming Platform Twitch. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642787. Online publication date: 11-May-2024.
• (2024) "I'd rather drink in VRChat": Understanding Drinking in Social Virtual Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642405. Online publication date: 11-May-2024.
• (2024) "I Got Flagged for Supposed Bullying, Even Though It Was in Response to Someone Harassing Me About My Disability.": A Study of Blind TikTokers' Content Moderation Experiences. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3613904.3642148. Online publication date: 11-May-2024.
• (2024) "People are Way too Obsessed with Rank": Trust System in Social Virtual Reality. Computer Supported Cooperative Work (CSCW). DOI: 10.1007/s10606-024-09498-7. Online publication date: 4-May-2024.
