research-article
Open access

The Potential of Diverse Youth as Stakeholders in Identifying and Mitigating Algorithmic Bias for a Future of Fairer AI

Published: 04 October 2023

Abstract

Youth regularly use technology driven by artificial intelligence (AI). However, it is increasingly well known that AI can cause harm on small and large scales, especially to those underrepresented in tech fields. Recently, users have played active roles in surfacing and mitigating harm from algorithmic bias. Despite being frequent users of AI, youth have been under-explored as potential contributors to and stakeholders in the future of AI. We consider three notions that may underlie the barriers youth face to playing an active role in responsible AI: that youth (1) cannot understand the technical aspects of AI, (2) cannot understand the ethical issues around AI, and (3) need protection from serious topics related to bias and injustice. In this study, we worked with youth (N = 30) in first through twelfth grade and parents (N = 6) to explore how youth can take part in identifying algorithmic bias and designing future systems that address problematic technology behavior. We found that youth are capable of identifying and articulating algorithmic bias, often in great detail. Participants suggested different ways users could give feedback on AI that reflects their values of diversity and inclusion. Youth who have less experience with computing or less exposure to societal structures can be supported by peers or adults with more of this knowledge, leading to critical conversations about fairer AI. This work illustrates youths' insights and suggests that they should be integrated into building a future of responsible AI.


Cited By

  • (2024) Ethical Governance of Artificial Intelligence: Guiding Youth towards Responsible Digital Citizenship. Journal of Policy Research, Vol. 10, 2, 726--733. DOI: 10.61506/02.00291. Online publication date: 1-Jun-2024.
  • (2024) "Show Them the Playbook That These Companies Are Using": Youth Voices about Why Computer Science Education Must Center Discussions of Power, Ethics, and Culturally Responsive Computing. ACM Transactions on Computing Education, Vol. 24, 3, 1--21. DOI: 10.1145/3660645. Online publication date: 23-Apr-2024.
  • (2024) Augmenting Youths' Critical Consciousness Through Redesign of Algorithmic Systems. In Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 2, 535--536. DOI: 10.1145/3632621.3671425. Online publication date: 12-Aug-2024.
  • (2024) Youth as Peer Auditors: Engaging Teenagers with Algorithm Auditing of Machine Learning Applications. In Proceedings of the 23rd Annual ACM Interaction Design and Children Conference, 560--573. DOI: 10.1145/3628516.3655752. Online publication date: 17-Jun-2024.
  • (2024) Children's sensemaking of algorithms and data flows across YouTube and social media. Information and Learning Sciences. DOI: 10.1108/ILS-12-2023-0201. Online publication date: 1-Aug-2024.

        Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue CSCW2 (CSCW)
        October 2023
        4055 pages
        EISSN: 2573-0142
        DOI: 10.1145/3626953
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. adolescents
        2. algorithm auditing
        3. computing education
        4. fair AI
        5. fate
        6. k-12
        7. responsible AI
        8. workshop
        9. youth


        Article Metrics

        • Downloads (Last 12 months)891
        • Downloads (Last 6 weeks)110
        Reflects downloads up to 01 Sep 2024
