Research article | Open access

Physiologically Attentive User Interface for Improved Robot Teleoperation

Published: 27 March 2023 | DOI: 10.1145/3581641.3584084

Abstract

User interfaces (UI) are shifting from being attention-hungry to being attentive to users' needs during interaction. Interfaces developed for robot teleoperation can be particularly complex, often displaying large amounts of information, which can increase the cognitive overload that impairs the operator's performance. This paper presents the development of a Physiologically Attentive User Interface (PAUI) prototype, preliminarily evaluated with six participants. A case study on Urban Search and Rescue (USAR) robot teleoperation was used, although the proposed approach aims to be generic. The robot considered provides an overly complex Graphical User Interface (GUI) whose source code is not accessible. This represents a recurring and challenging scenario in which robots remain in use but technical updates are no longer offered, which usually leads to their abandonment. A major contribution of the approach is the possibility of recycling old systems while improving the UI made available to end users, taking their physiological data as input. The proposed PAUI analyses physiological data, facial expressions, and eye movements to classify three mental states (rest, workload, and stress). An Attentive User Interface (AUI) is then assembled by recycling a pre-existing GUI, which is dynamically modified according to the predicted mental state to improve the user's focus during mentally demanding situations. In addition to the novelty of PAUIs that take advantage of pre-existing GUIs, this work also contributes the design of a user experiment comprising mental state induction tasks that successfully trigger high and low cognitive overload states. Results from the preliminary user evaluation revealed a tendency towards improved usefulness and ease of use of the PAUI, although without statistical significance due to the small number of participants.
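
As a concrete illustration of the pipeline described in the abstract, the following minimal Python sketch (not taken from the paper; the feature size, network architecture, and adaptation rules are illustrative assumptions) trains a small classifier that maps a fused physiological feature vector to one of the three mental states and uses the prediction to select a GUI adaptation rule:

# Illustrative sketch only: a three-state mental-state classifier driving a GUI
# adaptation rule. All names, shapes, and rules below are assumptions, not the
# authors' implementation.
import numpy as np
import tensorflow as tf

STATES = ["rest", "workload", "stress"]
N_FEATURES = 32  # assumed size of the fused physiological/facial/eye feature vector


def build_classifier() -> tf.keras.Model:
    # Small feed-forward network with a softmax output over the three states.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(N_FEATURES,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(len(STATES), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def adapt_gui(state: str) -> None:
    # Placeholder adaptation rules; a real system would modify the recycled GUI here.
    if state == "rest":
        print("Show the full interface: all panels visible.")
    elif state == "workload":
        print("Hide secondary panels and enlarge the camera feed.")
    else:  # stress
        print("Show only critical controls and alerts.")


if __name__ == "__main__":
    # Synthetic data stands in for labelled physiological recordings.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(300, N_FEATURES)).astype("float32")
    labels = rng.integers(0, len(STATES), size=300)

    classifier = build_classifier()
    classifier.fit(features, labels, epochs=5, batch_size=32, verbose=0)

    sample = rng.normal(size=(1, N_FEATURES)).astype("float32")
    predicted = STATES[int(np.argmax(classifier.predict(sample, verbose=0)))]
    adapt_gui(predicted)

In the approach described by the authors, the adaptation step acts on the recycled, pre-existing GUI rather than printing messages; the sketch only shows where that hook sits in the loop.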


Cited By

  • (2023) Driving with Black Box Assistance: Teleoperated Driving Interface Design Guidelines for Computational Driver Assistance Systems in Unstructured Environments. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 156-166. https://doi.org/10.1145/3580585.3607169 (published 18 Sep 2023)
  • (2023) Teleoperation in the Age of Mixed Reality: VR, AR, and ROS Integration for Human-Robot Direct Interaction. 2023 4th International Conference on Electronics and Sustainable Communication Systems (ICESC), pp. 240-245. https://doi.org/10.1109/ICESC57686.2023.10193567 (published 6 Jul 2023)


    Published In

    IUI '23: Proceedings of the 28th International Conference on Intelligent User Interfaces
    March 2023
    972 pages
    ISBN:9798400701061
    DOI:10.1145/3581641
    Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Attentive User Interface
    2. Human-Robot Interaction
    3. Mental State Classification
    4. Neural Networks
    5. Recycling User Interfaces
    6. Robot Teleoperation

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Fundação para a Ciência e a Tecnologia

    Conference

    IUI '23

    Acceptance Rates

    Overall Acceptance Rate 746 of 2,811 submissions, 27%
