DOI: 10.1145/3290605.3300815
CHI Conference Proceedings — research article

TrackCap: Enabling Smartphones for 3D Interaction on Mobile Head-Mounted Displays

Published: 02 May 2019
    Abstract

    The latest generation of consumer-market head-mounted displays (HMDs) includes self-contained inside-out tracking of head motion, which makes them suitable for mobile applications. However, 3D tracking of input devices is either not included at all or requires keeping the device in sight of a sensor mounted on the HMD. Both approaches make natural interaction cumbersome in mobile applications. TrackCap, a novel approach for 3D tracking of input devices, turns a conventional smartphone into a precise 6DOF input device for an HMD user. The device can be conveniently operated both inside and outside the HMD's field of view, and it provides additional 2D input and output capabilities.
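    The abstract describes the approach only at a high level: a marker "cap" is mounted on the HMD, and the phone's camera tracks it, so the phone can be localized even when it leaves the HMD's field of view. As a rough, hypothetical sketch (function names are ours, not the paper's), once a vision step such as a PnP solver yields the cap's pose in the phone-camera frame, recovering the phone's 6DOF pose in HMD coordinates is a rigid-transform composition:

```python
# Hypothetical sketch of the pose chain a cap-on-HMD tracker must solve.
# Assumed inputs: T_cam_cap, the cap's pose in the phone-camera frame
# (e.g. from a PnP solver on the detected markers), and T_hmd_cap, the
# calibrated mounting of the cap on the HMD. Pure Python, 4x4 matrices
# as nested lists, so the sketch has no dependencies.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def make_pose(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation R and translation t."""
    return [[R[0][0], R[0][1], R[0][2], t[0]],
            [R[1][0], R[1][1], R[1][2], t[1]],
            [R[2][0], R[2][1], R[2][2], t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def invert_pose(T):
    """Invert a rigid transform analytically: inv([R|t]) = [R^T | -R^T t]."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    t = [-sum(Rt[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return make_pose(Rt, t)

def phone_pose_in_hmd(T_cam_cap, T_hmd_cap):
    """
    T_cam_cap: cap pose in the phone-camera frame (cap -> camera).
    T_hmd_cap: fixed cap mounting on the HMD (cap -> HMD).
    Returns the phone camera's 6DOF pose in HMD coordinates (camera -> HMD).
    """
    return mat_mul(T_hmd_cap, invert_pose(T_cam_cap))
```

    For example, if the camera sees the cap 0.5 m straight ahead and the cap sits at the HMD origin, the composition places the phone 0.5 m behind the cap along the viewing axis in HMD coordinates. The paper's actual detection and filtering pipeline is not reproduced here.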

    Supplementary Material

    • MP4 File (paper585p.mp4): Preview video
    • MP4 File (pn5492.mp4): Supplemental video
    • MP4 File (paper585.mp4)




      Published In

      CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
      May 2019
      9077 pages
      ISBN: 9781450359702
      DOI: 10.1145/3290605

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. 3d pointing
      2. augmented reality
      3. hmd
      4. input devices
      5. mixed reality
      6. mobile devices
      7. wearable computing

      Qualifiers

      • Research-article

      Funding Sources

      • Österreichische Forschungsförderungsgesellschaft mbH (FFG)
      • Competence Center VRVis

      Conference

      CHI '19

      Acceptance Rates

      CHI '19 Paper Acceptance Rate: 703 of 2,958 submissions, 24%
      Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


      Cited By

      • (2024) SmartVR Pointer: Using Smartphones and Gaze Orientation for Selection and Navigation in Virtual Reality. Sensors 24(16), 5168. DOI: 10.3390/s24165168 (10 Aug 2024)
      • (2024) Unveiling the Invisible: Interactive Spatial Sensing Transforms Air Flow Measurement. ACM SIGGRAPH 2024 Immersive Pavilion, 1-2. DOI: 10.1145/3641521.3664408 (27 Jul 2024)
      • (2024) Screen Augmentation Technique Using AR Glasses and Smartphone without External Sensors. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 1-5. DOI: 10.1145/3613905.3648664 (11 May 2024)
      • (2024) PhoneInVR: An Evaluation of Spatial Anchoring and Interaction Techniques for Smartphone Usage in Virtual Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642582 (11 May 2024)
      • (2024) Low-Fi VR Controller: Bringing 6DOF Interaction to Mobile VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 1168-1169. DOI: 10.1109/VRW62533.2024.00378 (16 Mar 2024)
      • (2024) [DC] Supporting Complex Interactions in Mobile Virtual Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 1138-1139. DOI: 10.1109/VRW62533.2024.00363 (16 Mar 2024)
      • (2024) Improving Inclusion of Virtual Reality Through Enhancing Interactions in Low-Fidelity VR. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 305-310. DOI: 10.1109/VRW62533.2024.00061 (16 Mar 2024)
      • (2024) Design Patterns for Situated Visualization in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 30(1), 1324-1335. DOI: 10.1109/TVCG.2023.3327398 (1 Jan 2024)
      • (2024) Text Entry Performance and Situation Awareness of a Joint Optical See-Through Head-Mounted Display and Smartphone System. IEEE Transactions on Visualization and Computer Graphics 30(8), 5830-5846. DOI: 10.1109/TVCG.2023.3309316 (Aug 2024)
      • (2023) Balancing Accuracy and Speed in Gaze-Touch Grid Menu Selection in AR via Mapping Sub-Menus to a Hand-Held Device. Sensors 23(23), 9587. DOI: 10.3390/s23239587 (3 Dec 2023)
