Abstract
The “Blackbox AI” installation, developed as part of the EthicAI=LABS project, seeks to raise awareness of the social impact and ethical dimensions of artificial intelligence (AI). This interdisciplinary installation spans several domains to highlight the underrepresentation of women in STEM fields and the biases present in AI applications. Gender-swapped stories of women’s experiences of workplace discrimination, collected through a survey, reveal common patterns and explore the effect of flipping the gender. A text-to-image generation experiment highlights a preference for men in STEM professions and the prevalence of social and racial biases. Facial recognition examples demonstrate the discriminatory effects of such technologies on women, while an image generation investigation raises questions about the influence of AI technology on ideals of beauty, with the aim of empowering women by exposing bias in AI tools. The project’s ultimate goal is to challenge visitors to rethink their role in shaping our digital future and to address gender bias in artificial intelligence.
Acknowledgments
The authors gratefully acknowledge Goethe Institut for funding the BlackboxAI installation developed as part of the 2022 edition of the EthicAI=LAB project. In particular, we would like to express our thanks to our mentors Mihaela Constantinescu, Marinos Koutsomichalis, and Fatih Sinan Esen for their expert guidance and encouragement during the development of this work.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Dobreva, M., Rukavina, T., Stamou, V., Vidaki, A.N., Zacharopoulou, L. (2023). A Multimodal Installation Exploring Gender Bias in Artificial Intelligence. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. HCII 2023. Lecture Notes in Computer Science, vol 14020. Springer, Cham. https://doi.org/10.1007/978-3-031-35681-0_2
DOI: https://doi.org/10.1007/978-3-031-35681-0_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-35680-3
Online ISBN: 978-3-031-35681-0
eBook Packages: Computer Science (R0)