DOI: 10.1145/3562939.3565607

“Kapow!”: Studying the Design of Visual Feedback for Representing Contacts in Extended Reality

Published: 29 November 2022

Abstract

In the absence of haptic feedback, perceiving contact with virtual objects can quickly become a problem in extended reality (XR) applications. XR developers often rely on visual feedback to inform the user and display contact information. However, to date, there is no clear path for designing and assessing such visual techniques. In this paper, we propose a design space for creating visual feedback techniques that represent contact with virtual surfaces in XR. Based on this design space, we conceived a set of visual techniques, including novel approaches based on onomatopoeia and inspired by cartoons, as well as visual effects based on physical phenomena. We then conducted a preliminary online user study with 60 participants, who assessed 6 visual feedback techniques in terms of user experience. Notably, we assessed for the first time the potential influence of the interaction context by comparing participants' answers in two different scenarios: an industrial versus an entertainment condition. Taken together, our design space and initial results could inspire XR developers across a wide range of applications in which the augmentation of contact is prominent, such as vocational training, industrial assembly/maintenance, surgical simulation, and videogames.
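To make the onomatopoeia-based family of techniques concrete, here is a minimal, engine-agnostic sketch of the general idea: spawning a cartoon-style text label at a contact point, with its size driven by impact strength. All names, thresholds, and the speed-to-scale mapping below are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Hypothetical sketch: map a contact event to an onomatopoeia label whose
# size reflects impact strength. Thresholds and names are assumptions.
from dataclasses import dataclass

@dataclass
class ContactEvent:
    position: tuple          # world-space contact point (x, y, z)
    impact_speed: float      # relative speed at contact, in m/s

@dataclass
class OnomatopoeiaLabel:
    text: str                # the word displayed at the contact point
    position: tuple          # where to render the label
    scale: float             # visual size of the label
    lifetime_s: float        # how long the label stays visible

def make_contact_label(event: ContactEvent,
                       min_scale: float = 0.5,
                       max_scale: float = 2.0,
                       max_speed: float = 3.0) -> OnomatopoeiaLabel:
    """Map impact speed to label size, clamped to [min_scale, max_scale]."""
    t = min(event.impact_speed / max_speed, 1.0)
    scale = min_scale + t * (max_scale - min_scale)
    # Stronger impacts get a louder word, weaker ones a softer one.
    text = "Kapow!" if t > 0.5 else "Tap"
    return OnomatopoeiaLabel(text, event.position, scale, lifetime_s=0.4)
```

In a real XR application this function would be called from the physics engine's collision callback, and the returned label rendered as a billboarded sprite that fades out over its lifetime.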

Supplementary Material

  • Presentation video: StudyingVFDesign_Video.mp4
  • Presentation teaser: StudyingVFDesign_Teaser.mp4
  • Supplemental material: StudyingVFDesign_SuppMat.pdf


Cited By

  • (2024) Juicy Text: Onomatopoeia and Semantic Text Effects for Juicy Player Experiences. In Proceedings of the 26th International Conference on Multimodal Interaction, 144–153. https://doi.org/10.1145/3678957.3685755
  • (2024) Using the Visual Language of Comics to Alter Sensations in Augmented Reality. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3613904.3642351
  • (2024) Softness Perception of Visual Objects Controlled by Touchless Inputs: The Role of Effective Distance of Hand Movements. IEEE Transactions on Visualization and Computer Graphics 30, 7, 4154–4169. https://doi.org/10.1109/TVCG.2023.3254522
  • (2024) "XR Is A Game": Exploring People's Reasons for Using, Limiting, and Abandoning Extended Reality. In 2024 IEEE International Conference on Real-time Computing and Robotics (RCAR), 271–278. https://doi.org/10.1109/RCAR61438.2024.10670927


      Published In

      VRST '22: Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology
      November 2022
      466 pages
      ISBN:9781450398893
      DOI:10.1145/3562939
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Contact
      2. Design Space
      3. Extended Reality
      4. User Experience
      5. Visual Feedback

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      VRST '22

      Acceptance Rates

      Overall Acceptance Rate 66 of 254 submissions, 26%


      Article Metrics

• Downloads (last 12 months): 94
• Downloads (last 6 weeks): 4
Reflects downloads up to 23 Dec 2024

