DOI: 10.1145/3643834.3661565

Labeling in the Dark: Exploring Content Creators’ and Consumers’ Experiences with Content Classification for Child Safety on YouTube

Published: 01 July 2024

Abstract

Protecting children’s online privacy is paramount. Online platforms seek to strengthen child privacy protection by incorporating new classification systems into their content moderation practices. One prominent example is YouTube’s “made for kids” (MFK) classification. However, traditional content moderation focuses on managing content rather than protecting users’ privacy, and little is known about how users experience these classification systems. Through a thematic analysis of online discussions about YouTube’s MFK classification system, we present a case study of content creators’ and consumers’ experiences. We found that creators and consumers perceived MFK classification as misaligned with their actual practices, that creators encountered unexpected consequences when labeling their content, and that both creators and consumers identified intersections between MFK classification and other platform designs. Our findings shed light on an interwoven network of multiple classification systems that extends the original focus on child privacy to broader child safety issues; these insights inform design principles for child-centered safety within this intricate network.

    Published In

    DIS '24: Proceedings of the 2024 ACM Designing Interactive Systems Conference
    July 2024
    3616 pages
    ISBN: 9798400705830
    DOI: 10.1145/3643834
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 July 2024

    Author Tags

    1. COPPA
    2. child privacy protection
    3. child safety
    4. content creation
    5. content creator

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    DIS '24: Designing Interactive Systems Conference
    July 1 - 5, 2024
    Copenhagen, Denmark

    Acceptance Rates

    Overall Acceptance Rate 1,158 of 4,684 submissions, 25%
