DOI: 10.1145/2501988.2502042

Mime: compact, low power 3D gesture sensing for interaction with head mounted displays

Published: 08 October 2013

Abstract

We present Mime, a compact, low-power 3D sensor for unencumbered free-form, single-handed gestural interaction with head-mounted displays (HMDs). Mime introduces a real-time signal processing framework that combines a novel three-pixel time-of-flight (TOF) module with a standard RGB camera. The TOF module achieves accurate 3D hand localization and tracking, and it thus enables motion-controlled gestures. The joint processing of 3D information with RGB image data enables finer, shape-based gestural interaction.
Our Mime hardware prototype achieves fast and precise 3D gestural control. Compared with state-of-the-art 3D sensors like TOF cameras, the Microsoft Kinect and the Leap Motion Controller, Mime offers several key advantages for mobile applications and HMD use cases: very small size, daylight insensitivity, and low power consumption. Mime is built using standard, low-cost optoelectronic components and promises to be an inexpensive technology that can either be a peripheral component or be embedded within the HMD unit. We demonstrate the utility of the Mime sensor for HMD interaction with a variety of application scenarios, including 3D spatial input using close-range gestures, gaming, on-the-move interaction, and operation in cluttered environments and in broad daylight conditions.
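The abstract above describes 3D hand localization with a three-pixel time-of-flight (TOF) module. As an illustration of the underlying principle only (the constant and function names here are ours, not the paper's, and this is not the authors' implementation), the basic TOF relation converts a measured round-trip light delay into range:

```python
# Illustrative sketch of time-of-flight ranging, the principle behind TOF
# modules such as Mime's three-pixel sensor. A light pulse travels to the
# hand and back; half the round-trip distance is the range to the target.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_delay_s: float) -> float:
    """Range (meters) to the reflecting target from the round-trip pulse delay."""
    return SPEED_OF_LIGHT_M_S * round_trip_delay_s / 2.0

# Example: a hand ~30 cm from the sensor yields a ~2 ns round-trip delay.
print(round(tof_range_m(2e-9), 3))  # prints 0.3
```

With three spatially separated TOF pixels, three such delay measurements can plausibly be combined to triangulate the hand's 3D position, which is the intuition behind a compact multi-pixel module; the paper's actual signal-processing framework is more involved than this sketch.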

Supplementary Material

suppl.mov (uist406.m4v)
Supplemental video




    Published In

    UIST '13: Proceedings of the 26th annual ACM symposium on User interface software and technology
    October 2013
    558 pages
ISBN: 9781450322683
DOI: 10.1145/2501988

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. 3d sensing
    2. gesture sensing
    3. glasses
    4. hand tracking
    5. head mounted displays
    6. mobile
7. time-of-flight imaging

    Qualifiers

    • Research-article

    Conference

UIST '13: The 26th Annual ACM Symposium on User Interface Software and Technology
October 8-11, 2013
St. Andrews, Scotland, United Kingdom

    Acceptance Rates

UIST '13 paper acceptance rate: 62 of 317 submissions (20%)
Overall acceptance rate: 561 of 2,567 submissions (22%)



    Cited By

    • (2024) Exploring the Design Space of Input Modalities for Working in Mixed Reality on Long-haul Flights. Proceedings of the 2024 ACM Designing Interactive Systems Conference, pp. 2267-2285. DOI: 10.1145/3643834.3661560. Online publication date: 1-Jul-2024.
    • (2024) TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pp. 1-18. DOI: 10.1145/3613904.3642323. Online publication date: 11-May-2024.
    • (2023) Evaluating design guidelines for hand proximate user interfaces. Proceedings of the 2023 ACM Designing Interactive Systems Conference, pp. 1159-1173. DOI: 10.1145/3563657.3596117. Online publication date: 10-Jul-2023.
    • (2023) TicTacToes: Assessing Toe Movements as an Input Modality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-17. DOI: 10.1145/3544548.3580954. Online publication date: 19-Apr-2023.
    • (2022) StretchAR. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(3), pp. 1-26. DOI: 10.1145/3550305. Online publication date: 7-Sep-2022.
    • (2022) Kuiper Belt: Utilizing the "Out-of-natural Angle" Region in the Eye-gaze Interaction for Virtual Reality. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-17. DOI: 10.1145/3491102.3517725. Online publication date: 29-Apr-2022.
    • (2021) A Human-Centered Assembly Workplace For Industry: Challenges and Lessons Learned. Procedia Computer Science 180, pp. 290-300. DOI: 10.1016/j.procs.2021.01.166. Online publication date: 2021.
    • (2020) "Blurry Touch Finger": Touch-Based Interaction for Mobile Virtual Reality with Clip-on Lenses. Applied Sciences 10(21), article 7920. DOI: 10.3390/app10217920. Online publication date: 8-Nov-2020.
    • (2020) Walk The Line: Leveraging Lateral Shifts of the Walking Path as an Input Modality for Head-Mounted Displays. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-15. DOI: 10.1145/3313831.3376852. Online publication date: 21-Apr-2020.
    • (2020) Around-Body Interaction: Interacting While on the Go. IEEE Pervasive Computing 19(2), pp. 74-78. DOI: 10.1109/MPRV.2020.2977850. Online publication date: 1-Apr-2020.
