DOI: 10.1145/3605390.3605411

Evaluation of a Multimodal Interaction System for Big Displays

Published: 20 September 2023

Abstract

Big displays (ultrawalls) are increasingly present in professional, urban and leisure environments. Traditionally used as visualization tools, large-size screen surfaces can expand their interaction capabilities when integrated with sensors and mobile devices. This article describes a system that enables both voice-and-gesture control and mobile-mediated interaction with multiple information streams on a big display. The system supports a visualization model based on tiles and predefined layouts, which handles the information streams and services. A preliminary set of commands combining voice and gesture control is proposed; the vocabulary is sufficient to configure a command-and-monitoring service for drone fleets. In this service scenario, the paper describes a study with ten users carried out to verify that multimodal interaction can be efficient and perceived as acceptable, in particular when compared to mobile-enabled interaction. Results show that both proposals may deliver similar efficiency for simple tasks. Users are not reluctant to use multimodality, although they prefer voice over gestures and suggest combining the mobile and multimodal approaches.
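
As an illustration only, the following Python sketch shows one way the fusion of a spoken action with the tile selected by a pointing gesture could be organized on a tile-and-layout wall. All class names, the grid layout and the command vocabulary below are assumptions made for exposition, not the implementation described in the paper.

    # Hypothetical sketch: fusing a voice command with a pointing gesture
    # to act on a tile in a predefined grid layout. Names and vocabulary
    # are illustrative assumptions, not the authors' system.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Tile:
        name: str   # stream or service shown in the tile
        x: int      # column index in the predefined layout grid
        y: int      # row index in the predefined layout grid

    @dataclass
    class PointingEvent:
        x: int      # grid cell the gesture currently points at
        y: int

    class WallController:
        """Keeps a predefined grid layout of tiles and resolves fused commands."""

        def __init__(self, columns: int, rows: int):
            self.columns, self.rows = columns, rows
            self.tiles: dict[tuple[int, int], Tile] = {}

        def place(self, tile: Tile) -> None:
            self.tiles[(tile.x, tile.y)] = tile

        def fuse(self, voice_command: str, pointing: Optional[PointingEvent]) -> str:
            """Combine a spoken action with the tile currently pointed at, if any."""
            target = self.tiles.get((pointing.x, pointing.y)) if pointing else None
            if voice_command == "maximize" and target:
                return f"maximize tile '{target.name}'"
            if voice_command == "close" and target:
                return f"close tile '{target.name}'"
            if voice_command.startswith("show "):
                # Voice-only command: open a new stream without a pointing target.
                stream = voice_command.removeprefix("show ")
                return f"open stream '{stream}' in a free cell"
            return "command not understood"

    # Example: the user points at the drone-telemetry tile and says "maximize".
    wall = WallController(columns=4, rows=2)
    wall.place(Tile(name="drone telemetry", x=1, y=0))
    print(wall.fuse("maximize", PointingEvent(x=1, y=0)))  # maximize tile 'drone telemetry'

In this sketch the pointing gesture only resolves the target tile, while the spoken word carries the action; a voice-only command falls back to layout-level behaviour, mirroring the complementary use of modalities that the abstract describes.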


    Published In

    CHItaly '23: Proceedings of the 15th Biannual Conference of the Italian SIGCHI Chapter
    September 2023
    416 pages
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 September 2023

    Author Tags

    1. Big displays
    2. Gesture recognition
    3. Mobile-mediated interaction
    4. Multimodal interaction

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Conference

    CHItaly 2023

    Acceptance Rates

    Overall Acceptance Rate 109 of 242 submissions, 45%
