DOI: 10.1145/3290605.3300676
CHI Conference Proceedings · Research Article

Augmented Reality Views for Occluded Interaction

Published: 02 May 2019

Abstract

We rely on our sight when manipulating objects. When objects are occluded, manipulation becomes difficult. Such occluded objects can be shown via augmented reality to re-enable visual guidance. However, it is unclear how to do so to best support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The best performing views were a see-through view and a displaced 3D view. The former enabled participants to observe the manipulated object through the occluder, while the latter showed the 3D view of the manipulated object offset from the object's real location. The worst performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that alignment of virtual objects with their real-world location is less important than an appropriate point-of-view and view stability.
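The displaced 3D view described above renders a virtual copy of the occluded object at an offset from its real-world location, so the user can see it beside the occluder. A minimal sketch of that idea, where the function name, the 4x4 matrix convention, and the offset value are illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def displaced_view_pose(object_pose, offset=(0.0, 0.25, 0.0)):
    """Return the pose at which to render the virtual copy of an
    occluded object: the real pose translated by a fixed world-space
    offset, with orientation unchanged (pose is a 4x4 matrix)."""
    displaced = np.array(object_pose, dtype=float, copy=True)
    displaced[:3, 3] += np.asarray(offset)  # shift position only
    return displaced

pose = np.eye(4)                   # object's real-world pose
proxy = displaced_view_pose(pose)  # render the AR copy at this pose
```

Keeping the orientation identical to the real object means hand rotations map one-to-one onto the displaced copy, which is consistent with the paper's finding that a stable, well-chosen viewpoint matters more than spatial alignment.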

Supplementary Material

ZIP File (pn4779.zip)
The auxiliary material contains a PDF file with a figure showing participants' ratings for every combination of object and visualization.
MP4 File (paper446p.mp4)
Preview video
MP4 File (pn4779.mp4)
Supplemental video
MP4 File (paper446.mp4)



Published In

CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
May 2019
9077 pages
ISBN:9781450359702
DOI:10.1145/3290605
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. finger-camera
  3. manipulation task

Qualifiers

  • Research-article

Conference

CHI '19

Acceptance Rates

CHI '19 paper acceptance rate: 703 of 2,958 submissions (24%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)


Cited By

  • (2025) Motivational influence of virtual reality in physical therapy for children with cerebral palsy: a systematic review protocol. BMJ Open 15(1), e075912. DOI: 10.1136/bmjopen-2023-075912. Online: 7 Jan 2025.
  • (2024) Using the Visual Language of Comics to Alter Sensations in Augmented Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642351. Online: 11 May 2024.
  • (2024) ClockRay: A Wrist-Rotation Based Technique for Occluded-Target Selection in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 30(7), 3767-3778. DOI: 10.1109/TVCG.2023.3239951. Online: Jul 2024.
  • (2024) Haptic Magnetism. IEEE Transactions on Haptics 17(2), 152-164. DOI: 10.1109/TOH.2023.3299528. Online: Apr 2024.
  • (2023) DensingQueen: Exploration Methods for Spatial Dense Dynamic Data. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-12. DOI: 10.1145/3607822.3614535. Online: 13 Oct 2023.
  • (2023) DecluttAR: An Interactive Visual Clutter Dimming System to Help Focus on Work. Proceedings of the Augmented Humans International Conference 2023, 159-170. DOI: 10.1145/3582700.3582718. Online: 12 Mar 2023.
  • (2023) Investigating Guardian Awareness Techniques to Promote Safety in Virtual Reality. 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 631-640. DOI: 10.1109/VR55154.2023.00078. Online: Mar 2023.
  • (2023) AR Interfaces for Disocclusion—A Comparative Study. 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 530-540. DOI: 10.1109/VR55154.2023.00068. Online: Mar 2023.
  • (2023) Design Patterns for Situated Visualization in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics, 1-12. DOI: 10.1109/TVCG.2023.3327398.
  • (2023) Wearable Augmented Reality: Research Trends and Future Directions from Three Major Venues. IEEE Transactions on Visualization and Computer Graphics 29(11), 4782-4793. DOI: 10.1109/TVCG.2023.3320231. Online: Nov 2023.
