DOI: 10.1145/3351095.3372844

Whose side are ethics codes on?: power, responsibility and the social good

Published: 27 January 2020

Abstract

The moral authority of ethics codes stems from an assumption that they serve a unified society, yet this ignores the political aspects of any shared resource. The sociologist Howard S. Becker challenged researchers to clarify their power and responsibility in the classic essay "Whose Side Are We On?". Building on Becker's hierarchy of credibility, we report on a critical discourse analysis of data ethics codes and emerging conceptualizations of beneficence, or the "social good", of data technology. The analysis revealed that ethics codes from corporations and professional associations conflated consumers with society and were largely silent on agency. Interviews with community organizers about social change in the digital era supplement the analysis, surfacing the limits of technical solutions to concerns of marginalized communities. Given evidence that highlights the gulf between the documents and lived experiences, we argue that ethics codes that elevate consumers may simultaneously subordinate the needs of vulnerable populations. Understanding contested digital resources is central to the emerging field of public interest technology. We introduce the concept of digital differential vulnerability to explain disproportionate exposures to harm within data technology and suggest recommendations for future ethics codes.

Supplementary Material

PDF File (p230-washington-supp.pdf)
Supplemental material.

References

[1]
Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
[2]
Laurence Anthony. 2013. A critical look at software tools in corpus linguistics. Linguistic Research 30, 2 (2013), 141--161.
[3]
ACM. 1992. ACM Code of Ethics and Professional Conduct. Retrieved from https://ethics.acm.org/code-of-ethics/previous-versions/1992-acm-code/
[4]
ACM U.S. Public Policy Council and ACM Europe Council Policy Committee. 2017. Statement on Internet of Things Privacy and Security. Retrieved from https://www.acm.org/binaries/content/assets/public-policy/2017_joint_statement_iotprivacysecurity.pdf
[5]
ACM US Public Policy Council. 2017. U.S. ACM Statement on Algorithmic Transparency and Accountability. Retrieved from https://www.acm.org/binaries/content/assets/public-policy/2017_usacm_statement_algorithms.pdf
[6]
ACM Committee on Professional Ethics. 2018. ACM Code of Ethics and Professional Conduct. Retrieved from https://ethics.acm.org/2018-code-draft-3/
[7]
Algorithmic Justice League and the Center on Privacy & Technology at Georgetown Law. 2018. Safe Face Pledge. Retrieved from https://www.safefacepledge.org/
[8]
Acxiom. 2016. The new codes of conduct: Guiding principles for the ethical use of data. Retrieved from https://marketing.acxiom.com/rs/982-LRE-196/images/ACX001_EthicalUseofData.pdf
[9]
Howard S. Becker. 1967. Whose side are we on? Social Problems 14, 3 (1967), 239--247.
[10]
Jacques J. Berleur and Klaus Brunnstein. 1996. Ethics of computing: Codes, spaces for discussion and law. Springer, New York.
[11]
Ruha Benjamin. 2019. Race after technology: Abolitionist tools for the New Jim Code. Polity Books, London.
[12]
Meredith Broussard. 2018. Artificial unintelligence: How computers misunderstand the world. MIT Press, Boston.
[13]
Joanna J. Bryson, Mihailis E. Diamantis, and Thomas D. Grant. 2017. Of, for, and by the people: the legal lacuna of synthetic persons. Artificial Intelligence and Law 25, 3 (September 2017), 273--291.
[14]
Carole Cadwalladr and Emma Graham-Harrison. 2018. How Cambridge Analytica turned Facebook 'likes' into a lucrative political tool. The Guardian, Retrieved August 20, 2019 from https://www.theguardian.com/technology/2018/mar/17/facebook-cambridge-analytica-kogan-data-algorithm
[15]
Danielle Citron. 2014. Hate crimes in cyberspace. Harvard University Press, Boston.
[16]
Richard Cowell, James Downe, and Karen Morgan. 2014. Managing politics? Ethics regulation and conflicting conceptions of "good conduct." Public Administration Review 74, 1 (January 2014), 29--38.
[17]
Data Science Association. 2017. Data Science Code Of Professional Conduct. Retrieved from https://www.datascienceassn.org/code-of-conduct.html
[18]
Teun A. van Dijk and Walter Kintsch. 1983. Strategies of discourse comprehension. Academic Press, New York.
[19]
Teun van Dijk. 2006. Discourse and manipulation. Discourse & Society 17, 3 (May 2006), 359--383.
[20]
Ruth R. Faden, Nancy M. P. King, and Tom L. Beauchamp. 1986. A history and theory of informed consent. Oxford University Press, New York.
[21]
Catherine Flick. 2016. Informed consent and the Facebook emotional manipulation study. Research Ethics 12, 1 (January 2016), 14--28.
[22]
Batya Friedman and Helen Nissenbaum. 1996. Bias in computer systems. ACM Transactions on Information Systems 14, 3 (July 1996), 330--347.
[23]
Future of Life Institute. 2017. Asilomar AI Principles. Retrieved from https://futureoflife.org/ai-principles/
[24]
Timnit Gebru, Jamie Morgenstern, Briana Vecchione, Jennifer Wortman Vaughan, Hanna Wallach, Hal Daumé III, and Kate Crawford. 2018. Datasheets for Datasets. In Proceedings of the 5th Workshop on Fairness, Accountability, and Transparency in Machine Learning (FAT/ML 2018).
[25]
Tarleton Gillespie. 2010. The politics of 'platforms.' New Media & Society 12, 3 (May 2010), 347--364.
[26]
Google. 2018. AI at Google: Our Principles. Retrieved from https://ai.google/principles/
[27]
Jonathan Heawood. 2018. Pseudo-public political speech: Democratic implications of the Cambridge Analytica scandal. Information Polity 23, 4 (December 2018), 429--434.
[28]
Kashmir Hill. 2019. I cut the "Big Five" tech giants from my life. It was hell, Gizmodo. Retrieved from https://gizmodo.com/i-cut-the-big-five-tech-giants-from-my-life-it-was-hel-1831304194
[29]
Adrian Holliday. 2007. Doing and writing qualitative research. Sage Publications, London; Thousand Oaks.
[30]
JSAI (Japanese Society for Artificial Intelligence). 2017. The Japanese Society for Artificial Intelligence Ethical Guidelines.
[31]
Grace Kyungwon Hong. 2015. Death Beyond Disavowal: The Impossible Politics of Difference. University of Minnesota Press, Minneapolis, Minn.
[32]
IEEE Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems. 2018. Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems. Retrieved from https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf
[33]
IBM. 2018. Everyday Ethics for Artificial Intelligence: A Practical Guide for Designers and Developers. Retrieved from https://www.ibm.com/watson/assets/duo/pdf/everydayethics.pdf
[34]
The Institute for Ethical AI & Machine Learning. 2019. The Responsible Machine Learning Principles. Retrieved from https://ethical.institute/principles.html
[35]
Intel. 2017. Artificial Intelligence: The Public Policy Opportunity. Retrieved from https://blogs.intel.com/policy/files/2017/10/Intel-Artificial-Intelligence-Public-Policy-White-Paper-2017.pdf
[36]
Laura James. 2018. Oaths, pledges and manifestos: a master list of ethical tech values. Doteveryone. Retrieved from https://medium.com/doteveryone/oaths-pledges-and-manifestos-a-master-list-of-ethical-tech-values-26e2672e161c
[37]
Sheila Jasanoff. 2007. Technologies of humility. Nature 450, 7166 (November 2007), 33.
[38]
Rachel Kuo. 2018. Racial justice activist hashtags: Counterpublics and discourse circulation. New Media & Society 20, 2 (2018), 495--514.
[39]
Lawrence Lessig. 2006. Code: Version 2.0. Basic Books, New York, New York.
[40]
Yvonna S. Lincoln and Egon G. Guba. 1985. Naturalistic inquiry. Sage Publications, Beverly Hills, Calif.
[41]
Katja Lindskov Jacobsen. 2010. Making design safe for citizens: A hidden history of humanitarian experimentation. Citizenship Studies 14, 1 (February 2010), 89--103.
[42]
Mike Loukides, Hilary Mason, and DJ Patil. 2018. Of oaths and checklists. O'REILLY. Retrieved from https://www.oreilly.com/ideas/of-oaths-and-checklists
[43]
Jennifer Mason. 2002. Making convincing arguments with qualitative data. In Qualitative researching. Sage Publications, London; Thousand Oaks, Calif.
[44]
Yeshimabeit Milner. 2018. An open letter to Facebook from the Data for Black Lives Movement. Medium. Retrieved from https://medium.com/@YESHICAN/an-open-letter-to-facebook-from-the-data-for-black-lives-movement-81e693c6b46c
[45]
Microsoft. 2018. Responsible bots: 10 guidelines for developers of conversational AI. November 4.
[46]
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. 1979. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. Retrieved from https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html
[47]
Gina Neff, Anissa Tanweer, Brittany Fiore-Gartland, and Laura Osburn. 2017. Critique and contribute: A practice-based framework for improving critical data studies and data science. Big Data 5, 2 (June 2017), 85--97.
[48]
Safiya Umoja Noble. 2018. Algorithms of oppression: How search engines Reinforce racism. New York University Press, New York, NY.
[49]
Effy Oz. 1992. Ethical Standards for Information Systems Professionals: A Case for a Unified Code. MIS Quarterly 16, 4 (1992), 423--433.
[50]
Henry Petroski. 1992. To engineer is human: The role of failure in successful design. Vintage Books, New York.
[51]
Irene Pollach. 2012. Taming textual data: The contribution of corpus linguistics to computer-aided text analysis. Organizational Research Methods 15, 2 (April 2012), 263--287.
[52]
David B. Resnik, Kevin C. Elliott, Patricia A. Soranno, and Elise M. Smith. 2017. Data-intensive science and research integrity. Accountability in Research 24, 6 (August 2017), 344--358.
[53]
Lyn Richards. 1998. Closeness to data: The changing goals of qualitative data handling. Qualitative Health Research 8, 3 (May 1998), 319--328.
[54]
Aaron Sankin. 2017. How activists of color lose battles against Facebook's moderator army. Reveal: The Center for Investigative Reporting. Retrieved from https://www.revealnews.org/article/how-activists-of-color-lose-battles-against-facebooks-moderator-army/
[55]
Maarten Sap, Dallas Card, Saadia Gabriel, Yejin Choi, and Noah Smith. 2019. The risk of racial bias in hate speech detection. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy. Retrieved from https://homes.cs.washington.edu/~skgabrie/sap2019risk.pdf
[56]
Zachary M Schrag. 2010. Ethical imperialism: institutional review boards and the social sciences, 1965-2009. Johns Hopkins University Press, Baltimore.
[57]
Donald A. Schön. 1983. The reflective practitioner: How professionals think in action. Basic Books, New York.
[58]
Evelyne Shuster. 1998. The Nuremberg code: Hippocratic ethics and human rights. The Lancet 351, 9107 (March 1998), 974--977.
[59]
Susan Leigh Star and Geoffrey C. Bowker. 2007. Enacting silence: residual categories as a challenge for ethics, information systems, and communication. Ethics and Information Technology 9, 4 (December 2007), 273--280.
[60]
Susan Leigh Star and Anselm Strauss. 1999. Layers of silence, arenas of voice: The ecology of visible and invisible work. Computer Supported Cooperative Work (CSCW) 8, 1-2 (February 1999), 9--30.
[61]
Marlise Silver Sweeney. 2014. What the law can (and can't) do about online harassment. The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2014/11/what-the-law-can-and-cant-do-about-online-harassment/382638/
[62]
Robert Vargas, Kayla Preito-Hodge, and Jeremy Christofferson. 2019. Digital vulnerability: The unequal risk of e-contact with the criminal justice system. RSF: The Russell Sage Foundation Journal of the Social Sciences 5, 1 (February 2019), 71--88.
[63]
Anne L. Washington. 2019. How to Argue with an Algorithm: Lessons from the COMPAS ProPublica Debate. The Colorado Technology Law Journal 17, 1 (2019). Retrieved from http://ctlj.colorado.edu/?page_id=635
[64]
Anne L. Washington. 2018. Facebook math: How 270,000 became 87 million. Data and Society Points. Retrieved August 20, 2019 from https://points.datasociety.net/facebook-math-how-270-000-became-87-million-bd8cf1009b32
[65]
White House. 2014. Big Data and Privacy: a technological perspective. Executive Office of the President, President's Council of Advisors on Science and Technology, Washington, DC.
[66]
Sylvia Wynter. 2003. Unsettling the coloniality of being/power/truth/freedom: Towards the human, after man, its overrepresentation---an argument. CR: The New Centennial Review 3, 3 (2003), 257--337.
[67]
Matthew Zook, Solon Barocas, danah boyd, Kate Crawford, Emily Keller, Seeta Peña Gangadharan, Alyssa Goodman, Rachelle Hollander, Barbara A. Koenig, Jacob Metcalf, Arvind Narayanan, Alondra Nelson, and Frank Pasquale. 2017. Ten simple rules for responsible big data research. PLoS Computational Biology 13, 3 (March 2017), e1005399.
[68]
Andrej Zwitter. 2014. Big Data ethics. Big Data & Society 1, 2 (July 2014).



      Published In

      FAT* '20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency
      January 2020
      895 pages
ISBN: 9781450369367
DOI: 10.1145/3351095


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 27 January 2020


      Author Tags

      1. data science
      2. digital differential vulnerability
      3. digital vulnerability
      4. ethics codes
      5. public interest technology
      6. social movements

      Qualifiers

      • Research-article

      Conference

      FAT* '20


      Cited By

• (2024) State-led embeddedness: Analyzing the discursive construction of platforms and social good in Beijing, Hangzhou, Shanghai, and Shenzhen. Global Media and China. https://doi.org/10.1177/20594364241226845. Online publication date: 9-Jan-2024.
• (2024) Teaching Ethics & Activism in a Human-Computer Interaction Professional Master's Program. Proceedings of the 55th ACM Technical Symposium on Computer Science Education V. 1, 1140-1146. https://doi.org/10.1145/3626252.3630939. Online publication date: 7-Mar-2024.
• (2023) Reframing data ethics in research methods education: a pathway to critical data literacy. International Journal of Educational Technology in Higher Education 20, 1. https://doi.org/10.1186/s41239-023-00380-y. Online publication date: 20-Feb-2023.
• (2023) Trustworthy AI and the Logics of Intersectional Resistance. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 172-182. https://doi.org/10.1145/3593013.3593986. Online publication date: 12-Jun-2023.
• (2023) Why We Should Supplement Ethics with Citizenship. Proceedings of the 2023 ACM Conference on Information Technology for Social Good, 68-73. https://doi.org/10.1145/3582515.3609519. Online publication date: 6-Sep-2023.
• (2023) Optimize. In Ethical Data Science, 83-100. https://doi.org/10.1093/oso/9780197693025.003.0006. Online publication date: 23-Nov-2023.
• (2023) Ethical Data Science. https://doi.org/10.1093/oso/9780197693025.001.0001. Online publication date: 23-Nov-2023.
• (2023) Integrating Ethics into the Guidelines for Assessment and Instruction in Statistics Education (GAISE). The American Statistician 77, 3, 323-330. https://doi.org/10.1080/00031305.2022.2156612. Online publication date: 4-Jan-2023.
• (2023) Accelerating Product Success: Designing a Digital Adoption Framework to Elevate Developer Experiences. Transfer, Diffusion and Adoption of Next-Generation Digital Technologies, 277-287. https://doi.org/10.1007/978-3-031-50192-0_24. Online publication date: 13-Dec-2023.
• (2022) No! Re-imagining Data Practices Through the Lens of Critical Refusal. Proceedings of the ACM on Human-Computer Interaction 6, CSCW2, 1-20. https://doi.org/10.1145/3557997. Online publication date: 11-Nov-2022.
