DOI: 10.1145/3173574.3173993

Designing Consistent Gestures Across Device Types: Eliciting RSVP Controls for Phone, Watch, and Glasses

Published: 21 April 2018

Abstract

In the era of ubiquitous computing, people expect applications to work across different devices. To provide a seamless user experience, it is therefore crucial that interfaces and interactions are consistent across device types. In this paper, we present a method for creating gesture sets that are consistent and easily transferable. The proposed method entails 1) gesture elicitation on each device type, 2) consolidation of a unified gesture set, and 3) a final validation by calculating a transferability score. We tested our approach by eliciting a set of user-defined gestures for reading text with Rapid Serial Visual Presentation (RSVP) on three device types: phone, watch, and glasses. We present the resulting unified gesture set for RSVP reading and show that our method is feasible for eliciting gesture sets that remain consistent across device types with different form factors.
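
As an informal illustration of the three steps named above, the short Python sketch below computes a per-device agreement rate for each referent (the standard pair-wise agreement formulation used in gesture-elicitation work by Vatavu and Wobbrock) and then a simple cross-device consistency figure. All gesture data and names are hypothetical, and the cross_device_consistency function is an illustrative stand-in, not the transferability score defined in the paper.

from itertools import combinations
from collections import Counter

# Hypothetical elicitation data: for each device type and referent (command),
# the gesture category proposed by each participant. Illustrative only.
proposals = {
    "phone": {
        "pause":  ["tap", "tap", "tap", "long-press"],
        "faster": ["swipe-up", "swipe-up", "swipe-right", "swipe-up"],
    },
    "watch": {
        "pause":  ["tap", "tap", "cover", "tap"],
        "faster": ["swipe-up", "rotate-crown", "swipe-up", "swipe-up"],
    },
    "glasses": {
        "pause":  ["tap-temple", "tap-temple", "nod", "tap-temple"],
        "faster": ["swipe-forward", "swipe-forward", "swipe-forward", "head-tilt"],
    },
}

def agreement_rate(signs):
    """Share of participant pairs that proposed the same gesture for a referent
    (pair-wise agreement rate as used in gesture elicitation studies)."""
    pairs = list(combinations(signs, 2))
    return sum(a == b for a, b in pairs) / len(pairs) if pairs else 0.0

def top_gesture(signs):
    """Most frequently proposed gesture for a referent on one device."""
    return Counter(signs).most_common(1)[0][0]

def cross_device_consistency(referent):
    """Illustrative stand-in for a transferability check: pick the gesture that
    wins on most devices and report the fraction of devices it wins on."""
    winners = [top_gesture(proposals[device][referent]) for device in proposals]
    unified = Counter(winners).most_common(1)[0][0]
    return unified, sum(w == unified for w in winners) / len(winners)

# Step 1: per-device agreement for each referent.
for device, referents in proposals.items():
    for referent, signs in referents.items():
        print(device, referent, round(agreement_rate(signs), 2))

# Steps 2-3 (sketched): consolidate and check how well the choice carries over.
for referent in ("pause", "faster"):
    print(referent, cross_device_consistency(referent))

A real consolidation step would also account for the input capabilities of each form factor rather than rely on the label-level majority voting used in this toy example.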

Supplementary Material

suppl.mov (pn3499-file5.mp4)
Supplemental video



    Information

    Published In

    CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
    April 2018
    8489 pages
    ISBN:9781450356206
    DOI:10.1145/3173574
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 21 April 2018


    Author Tags

    1. consistency
    2. design methods
    3. gesture elicitation
    4. rsvp
    5. transferability

    Qualifiers

    • Research-article

    Funding Sources

    • JST (CREST, Presto)

    Conference

    CHI '18

    Acceptance Rates

    CHI '18 Paper Acceptance Rate: 666 of 2,590 submissions, 26%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


    Article Metrics

    • Downloads (Last 12 months): 50
    • Downloads (Last 6 weeks): 5
    Reflects downloads up to 02 Feb 2025

    Cited By

    • (2024) Your Eyes on Speed: Using Pupil Dilation to Adaptively Select Speed-Reading Parameters in Virtual Reality. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-17. https://doi.org/10.1145/3676531. Online publication date: 24-Sep-2024.
    • (2024) Enhancing Mobile Interaction: Practical Insights from Smartphone and Smartwatch Integration. Adjunct Proceedings of the 26th International Conference on Mobile Human-Computer Interaction, 1-9. https://doi.org/10.1145/3640471.3680451. Online publication date: 21-Sep-2024.
    • (2024) RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction 31(2), 1-36. https://doi.org/10.1145/3617365. Online publication date: 29-Jan-2024.
    • (2024) Controlling the Rooms: How People Prefer Using Gestures to Control Their Smart Homes. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. https://doi.org/10.1145/3613904.3642687. Online publication date: 11-May-2024.
    • (2024) Exploring Methods to Optimize Gesture Elicitation Studies: A Systematic Literature Review. IEEE Access 12, 64958-64979. https://doi.org/10.1109/ACCESS.2024.3387269. Online publication date: 2024.
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. https://doi.org/10.1145/3636458. Online publication date: 7-Dec-2023.
    • (2023) Towards a Consensus Gesture Set: A Survey of Mid-Air Gestures in HCI for Maximized Agreement Across Domains. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-24. https://doi.org/10.1145/3544548.3581420. Online publication date: 19-Apr-2023.
    • (2023) iFAD Gestures: Understanding Users' Gesture Input Performance with Index-Finger Augmentation Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. https://doi.org/10.1145/3544548.3580928. Online publication date: 19-Apr-2023.
    • (2023) Sequential Eyelid Gestures for User Interfaces in VR. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 717-718. https://doi.org/10.1109/VRW58643.2023.00201. Online publication date: Mar-2023.
    • (2023) Reading and Walking with Smart Glasses: Effects of Display and Control Modes on Safety. International Journal of Human–Computer Interaction 40(23), 7875-7891. https://doi.org/10.1080/10447318.2023.2276529. Online publication date: 7-Nov-2023.
