DOI: 10.1145/3424953.3426545
research-article

Exploring user profiles based on their explainability requirements in interactive systems

Published: 23 December 2020

Abstract

Depending on how they are designed, interactive systems may inadvertently influence the opinions, choices, and actions of their users. To mitigate this problem, there has been growing demand for such systems to be able to explain to users how their outputs are generated. This demand is called the "right to an explanation", or the explainability requirement. User perceptions are still little explored in this context, for example: (1) do people value software explainability? (2) what are the typical user profiles in terms of their demand for explainability? This work seeks to elucidate users' perceptions and to identify user profiles in terms of their perceptions of the explainability requirement. The method combines a questionnaire, data mining algorithms, and statistical analysis to uncover and analyze user profiles. In a case study conducted with 61 people, we found 6 profiles, each representing a different perception of, and type of requirement for, software explainability. The implications of these profiles for the design of interactive systems are discussed.
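The profile-mining step the abstract describes (clustering questionnaire responses with k-means, per the author tags) can be sketched roughly as follows. This is an illustrative sketch with synthetic data, not the authors' code; the number of questionnaire items, the Likert scale, the iteration count, and all names are assumptions — only the 61 respondents and 6 profiles come from the paper.

```python
import random

random.seed(0)
N, ITEMS, K = 61, 8, 6  # 61 respondents (from the paper); 8 Likert items and 6 clusters assumed

# Synthetic 1-5 Likert responses, e.g. agreement with statements such as
# "I value explanations of how the system produced its output".
responses = [[random.randint(1, 5) for _ in range(ITEMS)] for _ in range(N)]

def dist(a, b):
    """Squared Euclidean distance between two answer vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(data, k, iters=20):
    """Plain k-means: assign each respondent to the nearest center,
    then move each center to the mean of its members."""
    centers = [list(c) for c in random.sample(data, k)]
    labels = [0] * len(data)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: dist(p, centers[j])) for p in data]
        for j in range(k):
            members = [p for p, lab in zip(data, labels) if lab == j]
            if members:
                centers[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels, centers

profiles, centers = kmeans(responses, K)  # one profile label per respondent
```

Each cluster center is then a mean answer pattern, which is what one would inspect (here with statistical analysis, as the abstract notes) to characterize a profile's stance on explainability.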


Cited By

  • (2022) Modeling and Evaluating Personas with Software Explainability Requirements. Human-Computer Interaction, pp. 136-149. DOI: 10.1007/978-3-030-92325-9_11. Online publication date: 1 January 2022.


Published In

IHC '20: Proceedings of the 19th Brazilian Symposium on Human Factors in Computing Systems
October 2020, 519 pages
ISBN: 9781450381727
DOI: 10.1145/3424953

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. explainability
      2. k-means
      3. profiles
      4. user perception

      Qualifiers

      • Research-article

      Conference

      IHC '20

      Acceptance Rates

IHC '20 Paper Acceptance Rate: 60 of 155 submissions, 39%.
Overall Acceptance Rate: 331 of 973 submissions, 34%.

Article Metrics

  • Downloads (last 12 months): 15
  • Downloads (last 6 weeks): 2

Reflects downloads up to 10 Nov 2024.
