DOI: 10.1145/3630106.3658915
Research article | Open access

No Simple Fix: How AI Harms Reflect Power and Jurisdiction in the Workplace

Published: 05 June 2024

Abstract

The introduction of AI into working processes has resulted in workers increasingly being subject to AI-related harms. By analyzing incidents of worker-related AI harms between 2008 and 2023 in the AI Incident Database, we find that harms are addressed only under considerably restricted scenarios. Results from a Qualitative Comparative Analysis (QCA) show that workers with more power resources, whether in the form of expertise or labor market power, have a greater likelihood of seeing harms fixed, all else equal. By contrast, workers lacking expertise or labor market power have lower success rates and must resort to legal or regulatory mechanisms to get fixes through. These findings suggest that the workplace is another arena in which AI has the potential to reproduce existing inequalities among workers, and that stronger legal frameworks and regulations can empower more vulnerable worker populations.
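The abstract describes coding incidents from the AI Incident Database and analyzing them with Qualitative Comparative Analysis (QCA). As a rough illustration only, the Python sketch below shows how a crisp-set QCA truth table might be assembled from coded cases; the condition names (expertise, labor market power, legal or regulatory action) follow the abstract's framing, but the codings are invented and nothing here reproduces the paper's actual data or analysis.

```python
# Minimal sketch of a crisp-set QCA-style truth table built from coded
# incidents. The condition names echo the abstract's framing, but every
# row below is a hypothetical illustration, not the paper's data.
from collections import defaultdict

# Each incident is coded 1/0 on the causal conditions and on the outcome
# ("harm fixed").
incidents = [
    {"expertise": 1, "labor_power": 1, "legal_action": 0, "fixed": 1},
    {"expertise": 1, "labor_power": 0, "legal_action": 0, "fixed": 1},
    {"expertise": 0, "labor_power": 1, "legal_action": 0, "fixed": 1},
    {"expertise": 0, "labor_power": 0, "legal_action": 1, "fixed": 1},
    {"expertise": 0, "labor_power": 0, "legal_action": 0, "fixed": 0},
]
conditions = ("expertise", "labor_power", "legal_action")

# Group incidents by their configuration of conditions and compute each
# configuration's consistency (share of cases in which the harm was fixed).
rows = defaultdict(lambda: {"n": 0, "fixed": 0})
for case in incidents:
    key = tuple(case[c] for c in conditions)
    rows[key]["n"] += 1
    rows[key]["fixed"] += case["fixed"]

print("expertise  labor_power  legal_action    n  consistency")
for key in sorted(rows, reverse=True):
    n, fixed = rows[key]["n"], rows[key]["fixed"]
    print(f"{key[0]:^9}  {key[1]:^11}  {key[2]:^12}  {n:>3}  {fixed / n:.2f}")
```

In a full QCA, a truth table like this would then be logically minimized (for example, with the QCA package for R) to identify which combinations of conditions are sufficient for the outcome of a harm being fixed.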



Published In

FAccT '24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency
June 2024
2580 pages
ISBN: 9798400704505
DOI: 10.1145/3630106
This work is licensed under a Creative Commons Attribution International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 05 June 2024


Author Tags

  1. algorithmic management
  2. artificial intelligence
  3. expertise
  4. governance
  5. harms
  6. regulation
  7. safety
  8. work

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • 100015601

Conference

FAccT '24

