DOI: 10.1145/3313831.3376353

Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds

Published: 23 April 2020

Abstract

Current Virtual Reality (VR) technologies focus on rendering visuospatial effects and are thus inaccessible to blind or low-vision users. We examine a novel white cane controller that enables users to navigate, without vision, large virtual environments with complex architecture such as winding paths and occluding walls and doors. The cane controller employs a lightweight three-axis brake mechanism to render the large-scale shape of virtual objects. Its multiple degrees of freedom let users adapt the controller to their preferred cane techniques and grip. In addition, surface textures are rendered through contact vibrations from a voice coil actuator, and spatialized audio is computed from the propagation of sound through the geometry around the user. We designed a scavenger-hunt game to demonstrate how the device enables blind users to navigate a complex virtual environment: seven of eight users successfully navigated a 6 m × 6 m virtual room, locating targets while avoiding collisions. We conclude with design considerations for creating immersive non-visual VR experiences, drawn from user preferences for cane techniques and cane material properties.
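
The texture rendering described above (contact vibrations played through a voice coil actuator) is commonly implemented with event-based haptics: on impact, play a short decaying sinusoid whose amplitude tracks impact velocity; while the tip drags, play speed-scaled noise. The Python sketch below illustrates that general technique only; it is not the authors' published implementation, and the sample rate, frequencies, decay constant, and function names are all illustrative assumptions.

```python
import numpy as np

# Illustrative drive rate for the voice coil signal (assumed, in Hz).
SAMPLE_RATE = 8000

def contact_transient(impact_velocity, freq=250.0, decay=0.015,
                      duration=0.05, gain=0.8):
    """Decaying sinusoid emitted when the virtual cane tip strikes a
    surface; amplitude scales with impact velocity so harder taps feel
    stronger. freq and decay would differ per surface material."""
    t = np.arange(0, duration, 1.0 / SAMPLE_RATE)
    amplitude = np.clip(gain * impact_velocity, 0.0, 1.0)
    return amplitude * np.exp(-t / decay) * np.sin(2 * np.pi * freq * t)

def drag_texture(speed, roughness=0.3, duration=0.1):
    """Band-limited noise burst while the tip drags across a surface;
    its level grows with sweep speed to approximate texture vibration."""
    n = int(duration * SAMPLE_RATE)
    noise = np.random.default_rng(0).standard_normal(n)
    return np.clip(roughness * speed, 0.0, 1.0) * noise

if __name__ == "__main__":
    tap = contact_transient(impact_velocity=0.6)   # a moderate tap
    sweep = drag_texture(speed=0.4)                # a slow sweep
    print(f"tap peak: {np.max(np.abs(tap)):.2f}, "
          f"sweep rms: {np.sqrt(np.mean(sweep**2)):.2f}")
```

In a real controller, waveforms like these would be streamed to the voice coil's amplifier each time the physics engine reports a cane-tip contact or drag event.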

Supplementary Material

SRT File (paper226pvc.srt): Preview video captions
ZIP File (paper226vfc.zip): Video figure captions
MP4 File (paper226vf.mp4): Supplemental video
MP4 File (paper226pv.mp4): Preview video
MP4 File (a226-siu-presentation.mp4): Presentation video




      Published In

      CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
      April 2020
      10688 pages
ISBN: 9781450367080
DOI: 10.1145/3313831
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Badges

      • Honorable Mention

      Author Tags

1. auditory feedback
2. blindness
3. haptic feedback
4. mobility
5. 3d audio
6. virtual reality
7. visual impairments
8. white cane

      Qualifiers

      • Research-article

      Conference

      CHI '20

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


      Article Metrics

• Downloads (Last 12 months): 302
• Downloads (Last 6 weeks): 30
      Reflects downloads up to 14 Jan 2025

      Cited By

• (2024) Strategies for Cultivating Students' Spatial Imagination in Art Courses of Colleges and Universities Assisted by Virtual Reality Technology. Applied Mathematics and Nonlinear Sciences, 9:1. DOI: 10.2478/amns-2024-3380. Online publication date: 18-Nov-2024.
• (2024) Exploring Human Interaction in Virtual Reality: An Experience Report on Users with and without Visual Impairment. Proceedings of the XXIII Brazilian Symposium on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3702038.3702047. Online publication date: 7-Oct-2024.
• (2024) AudioMove: Applying the Spatial Audio to Multi-Directional Limb Exercise Guidance. Proceedings of the ACM on Human-Computer Interaction, 8:MHCI, 1-26. DOI: 10.1145/3676489. Online publication date: 24-Sep-2024.
• (2024) From Skepticism to Acceptance: On the Dynamics of Elderly Engagement with Mixed Reality. Proceedings of Mensch und Computer 2024, 67-82. DOI: 10.1145/3670653.3670666. Online publication date: 1-Sep-2024.
• (2024) Speed-of-Light VR for Blind People: Conveying the Location of Arm-Reach Targets. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-5. DOI: 10.1145/3663548.3688533. Online publication date: 27-Oct-2024.
• (2024) Accessible Nonverbal Cues to Support Conversations in VR for Blind and Low Vision People. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-13. DOI: 10.1145/3663548.3675663. Online publication date: 27-Oct-2024.
• (2024) Towards Accessible Musical Performances in Virtual Reality: Designing a Conceptual Framework for Omnidirectional Audio Descriptions. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-17. DOI: 10.1145/3663548.3675618. Online publication date: 27-Oct-2024.
• (2024) Stick-To-XR: Understanding Stick-Based User Interface Design for Extended Reality. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 168-179. DOI: 10.1145/3643834.3661627. Online publication date: 1-Jul-2024.
• (2024) SoundShift: Exploring Sound Manipulations for Accessible Mixed-Reality Awareness. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 116-132. DOI: 10.1145/3643834.3661556. Online publication date: 1-Jul-2024.
• (2024) SocialCueSwitch: Towards Customizable Accessibility by Representing Social Cues in Multiple Senses. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3613905.3651109. Online publication date: 11-May-2024.
