Abstract
Data generation through crowdsourcing has become a common practice for building or augmenting Artificial Intelligence (AI) systems. These systems often reflect the stereotypical behaviors expressed by humans in the contributed data, which can be problematic, especially in sensitive tasks. One such task is the interpretation of images depicting people. In this work, we evaluate a crowdsourcing approach aimed at identifying the stereotypes conveyed in annotations collected on people images. By including both closed-ended, categorical responses and open-ended tags during the data collection phase, we can detect potentially harmful crowd behaviors. Our results suggest a means of assessing descriptive tags for their alignment with stereotypical beliefs related to gender, age, and body weight. The study concludes with a discussion of how our analytical approach can be applied to pre-existing datasets with similar characteristics, or to future crowdsourced knowledge, in order to audit for stereotypes.
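The abstract does not detail the analysis itself; as a purely illustrative sketch (not the authors' method), one way to pair open-ended tags with closed-ended categorical responses and surface tags whose usage is concentrated in a single demographic group might look as follows, assuming hypothetical annotation records with `tag` and `gender` fields:

```python
# Illustrative sketch only: the abstract does not specify the exact analysis,
# so this is a hypothetical example of how open-ended tags could be checked
# for skew across closed-ended demographic categories.
from collections import Counter, defaultdict

# Hypothetical annotation records: each pairs a free-text tag with the
# closed-ended category reported for the depicted person (e.g., gender).
annotations = [
    {"tag": "nurturing", "gender": "female"},
    {"tag": "leader", "gender": "male"},
    {"tag": "nurturing", "gender": "female"},
    {"tag": "leader", "gender": "female"},
]

def tag_category_skew(records, tag_key="tag", cat_key="gender"):
    """Return, for each tag, the share of its uses falling in each category.

    Tags whose usage is heavily concentrated in one demographic group are
    candidates for closer human review for stereotypical content.
    """
    counts = defaultdict(Counter)
    for record in records:
        counts[record[tag_key]][record[cat_key]] += 1
    skew = {}
    for tag, cat_counts in counts.items():
        total = sum(cat_counts.values())
        skew[tag] = {cat: n / total for cat, n in cat_counts.items()}
    return skew

if __name__ == "__main__":
    for tag, shares in tag_category_skew(annotations).items():
        print(tag, shares)
```

Tags with a heavily skewed share (e.g., applied almost exclusively to images of one group) could then be reviewed manually against known gender, age, or body-weight stereotypes.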
Notes
- 1.
- 2.
- 3.
- 4. Sample images can be found at: https://www.chicagofaces.org/.
- 5. Our research protocol has undergone ethical review and received approval by the Cyprus National Bioethics Committee.
Acknowledgments
This project has received funding from the Cyprus Research and Innovation Foundation under grant EXCELLENCE/0421/0360 (KeepA(n)I), the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 739578 (RISE), and the Government of the Republic of Cyprus through the Deputy Ministry of Research, Innovation and Digital Policy.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Christoforou, E., Orphanou, K., Kyriacou, M., Otterbacher, J. (2024). A Crowdsourcing Approach for Identifying Potential Stereotypes in the Collected Data. In: Coman, A., Vasilache, S. (eds) Social Computing and Social Media. HCII 2024. Lecture Notes in Computer Science, vol 14703. Springer, Cham. https://doi.org/10.1007/978-3-031-61281-7_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-61280-0
Online ISBN: 978-3-031-61281-7
eBook Packages: Computer Science (R0)