
Leveraging Implicit Gaze-Based User Feedback for Interactive Machine Learning

  • Conference paper
  • In: KI 2022: Advances in Artificial Intelligence (KI 2022)

Abstract

Interactive Machine Learning (IML) systems incorporate humans into the learning process to enable iterative and continuous model improvements. The interactive process can be designed to leverage the expertise of domain experts with no background in machine learning, for instance, through repeated user feedback requests. However, excessive requests can be perceived as annoying and cumbersome and could reduce user trust. Hence, it is essential to establish an efficient dialog between the user and the machine learning system. We aim to detect when a domain expert disagrees with the output of a machine learning system by observing their eye movements and facial expressions. In this paper, we describe our approach for modelling user disagreement and discuss how such a model could be used to trigger user feedback requests in the context of interactive machine learning.
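
To make the proposed triggering mechanism concrete, the sketch below illustrates the idea in Python. It is a minimal illustration only: the Observation features, the linear weights, and the 0.7 threshold are hypothetical assumptions for this summary, not the authors' published model, which would instead be a disagreement classifier trained on real eye-tracking and facial-expression data.

    from dataclasses import dataclass

    @dataclass
    class Observation:
        fixation_duration_ms: float  # gaze feature: long fixations can indicate confusion
        pupil_dilation: float        # gaze feature: normalised arousal proxy in [0, 1]
        brow_furrow: float           # facial feature: action-unit intensity (e.g. AU4) in [0, 1]

    def disagreement_score(obs: Observation) -> float:
        # Toy linear fusion of gaze and facial-expression features into [0, 1].
        # Weights are illustrative, not learned from data.
        score = (0.4 * min(obs.fixation_duration_ms / 1000.0, 1.0)
                 + 0.3 * obs.pupil_dilation
                 + 0.3 * obs.brow_furrow)
        return max(0.0, min(score, 1.0))

    def should_request_feedback(obs: Observation, threshold: float = 0.7) -> bool:
        # Ask for explicit feedback only when implicit signals suggest
        # disagreement, keeping interruptions of the expert rare.
        return disagreement_score(obs) >= threshold

    obs = Observation(fixation_duration_ms=850.0, pupil_dilation=0.6, brow_furrow=0.8)
    if should_request_feedback(obs):
        print("Possible disagreement: ask the expert to confirm or correct the output.")

In a deployed IML system, the hand-tuned weights would be replaced by a trained disagreement detector running over streaming gaze and facial-expression features, and the threshold would govern how often the expert is interrupted.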




Acknowledgements

This work was funded by the German Federal Ministry of Education and Research (BMBF) under grant number 01JD1811C (GeAR).

Author information

Correspondence to Omair Bhatti.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Bhatti, O., Barz, M., Sonntag, D. (2022). Leveraging Implicit Gaze-Based User Feedback for Interactive Machine Learning. In: Bergmann, R., Malburg, L., Rodermund, S.C., Timm, I.J. (eds.) KI 2022: Advances in Artificial Intelligence. KI 2022. Lecture Notes in Computer Science, vol. 13404. Springer, Cham. https://doi.org/10.1007/978-3-031-15791-2_2


  • DOI: https://doi.org/10.1007/978-3-031-15791-2_2


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-15790-5

  • Online ISBN: 978-3-031-15791-2

  • eBook Packages: Computer Science; Computer Science (R0)
