- research-article, June 2024, Best Paper
Public Technologies Transforming Work of the Public and the Public Sector
- Seyun Kim,
- Bonnie Fan,
- Willa Yunqi Yang,
- Jessie Ramey,
- Sarah E Fox,
- Haiyi Zhu,
- John Zimmerman,
- Motahhare Eslami
CHIWORK '24: Proceedings of the 3rd Annual Meeting of the Symposium on Human-Computer Interaction for Work, June 2024, Article No.: 20, Pages 1–12. https://doi.org/10.1145/3663384.3663407
Technologies adopted by the public sector have transformed the work practices of employees in public agencies by creating different means of communication and decision-making. Although much of the recent research in the future of work domain has ...
- research-article, June 2024
‘Your Duties Are To Sweep A Floor Remotely’: Low Information Quality in Job Advertisements is a Barrier to Low-Income Job-Seekers’ Successful Use of Digital Platforms
- Sara Kingsley,
- Michael Six Silberman,
- Clara Wang,
- Robert Lambeth,
- Jiayin Zhi,
- Motahhare Eslami,
- Beibei Li,
- Jeffrey Bigham
CHIWORK '24: Proceedings of the 3rd Annual Meeting of the Symposium on Human-Computer Interaction for Work, June 2024, Article No.: 18, Pages 1–20. https://doi.org/10.1145/3663384.3663403
Digital platforms have become central in job search. Job-seekers' experiences with these platforms, however, are a relatively new research area. This paper presents findings from 27 interviews with US low-income job-seekers. Job-seekers encountered many ...
- research-article, June 2024
Building, Shifting, & Employing Power: A Taxonomy of Responses From Below to Algorithmic Harm
FAccT '24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, June 2024, Pages 1093–1106. https://doi.org/10.1145/3630106.3658958
A large body of research has attempted to ensure that algorithmic systems adhere to notions of fairness and transparency. Increasingly, researchers have highlighted that mitigating algorithmic harms requires explicitly taking power structures into ...
- research-article, June 2024
The Fall of an Algorithm: Characterizing the Dynamics Toward Abandonment
FAccT '24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency, June 2024, Pages 337–358. https://doi.org/10.1145/3630106.3658910
As more algorithmic systems have come under scrutiny for their potential to inflict societal harms, an increasing number of organizations that hold power over harmful algorithms have chosen, or were required under the law, to abandon them. While social ...
- extended-abstract, May 2024
Human-Centered Evaluation and Auditing of Language Models
CHI EA '24: Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, May 2024, Article No.: 476, Pages 1–6. https://doi.org/10.1145/3613905.3636302
The recent advancements in Large Language Models (LLMs) have significantly impacted numerous real-world applications, and will impact many more. However, these models also pose significant risks to individuals and society. To mitigate these issues and guide ...
- extended-abstract, October 2023
Supporting User Engagement in Testing, Auditing, and Contesting AI
- Wesley Hanwen Deng,
- Michelle S. Lam,
- Ángel Alexander Cabrera,
- Danaë Metaxa,
- Motahhare Eslami,
- Kenneth Holstein
CSCW '23 Companion: Companion Publication of the 2023 Conference on Computer Supported Cooperative Work and Social Computing, October 2023, Pages 556–559. https://doi.org/10.1145/3584931.3611279
In recent years, interest in directly involving end users in testing, auditing, and contesting AI systems has grown. The involvement of end users, especially from diverse backgrounds, can be essential to overcome AI developers' blind spots and to ...
- research-article, October 2023
The Potential of Diverse Youth as Stakeholders in Identifying and Mitigating Algorithmic Bias for a Future of Fairer AI
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 7, Issue CSCW2, Article No.: 364, Pages 1–27. https://doi.org/10.1145/3610213
Youth regularly use technology driven by artificial intelligence (AI). However, it is increasingly well-known that AI can cause harm on small and large scales, especially for those underrepresented in tech fields. Recently, users have played active roles ...
- research-article, June 2023
Investigating Practices and Opportunities for Cross-functional Collaboration around AI Fairness in Industry Practice
FAccT '23: Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, June 2023, Pages 705–716. https://doi.org/10.1145/3593013.3594037
An emerging body of research indicates that ineffective cross-functional collaboration – the interdisciplinary work done by industry practitioners across roles – represents a major barrier to addressing issues of fairness in AI design and development. ...
- research-article, April 2023
Participation and Division of Labor in User-Driven Algorithm Audits: How Do Everyday Users Work together to Surface Algorithmic Harms?
- Rena Li,
- Sara Kingsley,
- Chelsea Fan,
- Proteeti Sinha,
- Nora Wai,
- Jaimie Lee,
- Hong Shen,
- Motahhare Eslami,
- Jason Hong
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, April 2023, Article No.: 567, Pages 1–19. https://doi.org/10.1145/3544548.3582074
Recent years have witnessed an interesting phenomenon in which users come together to interrogate potentially harmful algorithmic behaviors they encounter in their everyday lives. Researchers have started to develop theoretical and empirical ...
- research-article, April 2023, Honorable Mention
“I Would Like to Design”: Black Girls Analyzing and Ideating Fair and Accountable AI
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, April 2023, Article No.: 452, Pages 1–14. https://doi.org/10.1145/3544548.3581378
Artificial intelligence (AI) literacy is especially important for those who may not be well-represented in technology design. We worked with ten Black girls in fifth and sixth grade from a predominantly Black school to understand their perceptions ...
- research-article, April 2023
Understanding Practices, Challenges, and Opportunities for User-Engaged Algorithm Auditing in Industry Practice
CHI '23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, April 2023, Article No.: 377, Pages 1–18. https://doi.org/10.1145/3544548.3581026
Recent years have seen growing interest among both researchers and practitioners in user-engaged approaches to algorithm auditing, which directly engage users in detecting problematic behaviors in algorithmic systems. However, we know little about ...
- research-article, November 2022
"Give Everybody [..] a Little Bit More Equity": Content Creator Perspectives and Responses to the Algorithmic Demonetization of Content Associated with Disadvantaged Groups
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 6, Issue CSCW2, Article No.: 424, Pages 1–37. https://doi.org/10.1145/3555149
Algorithmic systems help manage the governance of digital platforms featuring user-generated content, including how money is distributed to creators from the profits a platform earns from advertising on this content. However, creators producing content ...
- extended-abstract, November 2022
Who Has an Interest in “Public Interest Technology”?: Critical Questions for Working with Local Governments & Impacted Communities
- Logan Stapleton,
- Devansh Saxena,
- Anna Kawakami,
- Tonya Nguyen,
- Asbjørn Ammitzbøll Flügge,
- Motahhare Eslami,
- Naja Holten Møller,
- Min Kyung Lee,
- Shion Guha,
- Kenneth Holstein,
- Haiyi Zhu
CSCW '22 Companion: Companion Publication of the 2022 Conference on Computer Supported Cooperative Work and Social Computing, November 2022, Pages 282–286. https://doi.org/10.1145/3500868.3560484
Local governments use a wide array of software, algorithms, and data systems across domains such as policing, probation, child protective services, courts, education, public employment services, homelessness services, etc. A growing body of work in ...
- research-article, April 2022
Toward User-Driven Algorithm Auditing: Investigating users’ strategies for uncovering harmful algorithmic behavior
CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, April 2022, Article No.: 626, Pages 1–19. https://doi.org/10.1145/3491102.3517441
Recent work in HCI suggests that users can be powerful in surfacing harmful algorithmic behaviors that formal auditing approaches fail to detect. However, it is not well understood how users are often able to be so effective, nor how we might support ...
- research-article, April 2022
Power Dynamics and Value Conflicts in Designing and Maintaining Socio-Technical Algorithmic Processes
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 6, Issue CSCW1, Article No.: 110, Pages 1–21. https://doi.org/10.1145/3512957
How do power dynamics and value conflicts affect our ability to design and maintain socio-technical algorithmic processes? In this paper, we study the SIGCHI student volunteer (SV) selection process that uses a weighted semi-randomized algorithm to ...
- research-article, October 2021
Everyday Algorithm Auditing: Understanding the Power of Everyday Users in Surfacing Harmful Algorithmic Behaviors
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 5, Issue CSCW2, Article No.: 433, Pages 1–29. https://doi.org/10.1145/3479577
A growing body of literature has proposed formal approaches to audit algorithmic systems for biased and harmful behaviors. While formal auditing approaches have been greatly impactful, they often suffer major blind spots, with critical issues surfacing ...
- research-article, October 2021, Honorable Mention
Algorithmic Folk Theories and Identity: How TikTok Users Co-Produce Knowledge of Identity and Engage in Algorithmic Resistance
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 5, Issue CSCW2, Article No.: 305, Pages 1–44. https://doi.org/10.1145/3476046
Algorithms in online platforms interact with users' identities in different ways. However, little is known about how users understand the interplay between identity and algorithmic processes on these platforms, and if and how such understandings shape ...
- keynote, June 2021
Revisiting Transparency and Fairness in Algorithmic Systems Through the Lens of Public Education and Engagement
L@S '21: Proceedings of the Eighth ACM Conference on Learning @ Scale, June 2021, Page 13. https://doi.org/10.1145/3430895.3462228
The power, opacity, and bias of algorithmic systems have opened up new research areas for bringing transparency, fairness, and accountability into these systems. In this talk, I will revisit these lines of work, and argue that while they are critical to ...
- research-article, May 2019
User Attitudes towards Algorithmic Opacity and Transparency in Online Reviewing Platforms
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019, Paper No.: 494, Pages 1–14. https://doi.org/10.1145/3290605.3300724
Algorithms exert great power in curating online information, yet are often opaque in their operation, and even existence. Since opaque algorithms sometimes make biased or deceptive decisions, many have called for increased transparency. However, little ...
- article, April 2019
Search bias quantification: investigating political bias in social media and web search
- Juhi Kulshrestha,
- Motahhare Eslami,
- Johnnatan Messias,
- Muhammad Bilal Zafar,
- Saptarshi Ghosh,
- Krishna P. Gummadi,
- Karrie Karahalios
Information Retrieval (INFRE), Volume 22, Issue 1-2, April 2019, Pages 188–227. https://doi.org/10.1007/s10791-018-9341-2
Users frequently use search systems on the Web as well as online social media to learn about ongoing events and public opinion on personalities. Prior studies have shown that the top-ranked results returned by these search engines can shape user opinion ...