X-Road: Virtual Reality Glasses for Orientation and Mobility Training of People with Visual Impairments

Published: 25 April 2020

Abstract
Orientation and Mobility (O&M) classes teach people with visual impairments how to navigate the world; for instance, how to cross a road. Yet, this training can be difficult and dangerous due to conditions such as traffic and weather. Virtual Reality (VR) can overcome these challenges by providing interactive, controlled environments. However, most existing VR tools rely on visual feedback, which limits their use with students with visual impairments. In a collaborative design approach with O&M instructors, we designed an affordable and accessible VR system for O&M classes, called X-Road. Using a smartphone and a bespoke headmount, X-Road provides both visual and audio feedback and allows users to move through space as in the real world. In a study with 13 students with visual impairments, X-Road proved to be an effective alternative for teaching and learning classical O&M tasks, and both students and instructors were enthusiastic about the technology. We conclude with design recommendations for inclusive VR systems.



    Published In

ACM Transactions on Accessible Computing, Volume 13, Issue 2
    June 2020
    184 pages
    ISSN:1936-7228
    EISSN:1936-7236
    DOI:10.1145/3397192

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 25 April 2020
    Accepted: 01 January 2020
    Revised: 01 December 2019
    Received: 01 February 2019
    Published in TACCESS Volume 13, Issue 2


    Author Tags

    1. Accessibility
    2. augmented reality
    3. mobility training
    4. virtual reality
    5. visual impairment

    Qualifiers

    • Research-article
    • Research
    • Refereed


    Cited By

• (2024) Investigating Virtual Reality Locomotion Techniques with Blind People. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1--17. DOI: 10.1145/3613904.3642088. Online publication date: 11-May-2024.
• (2024) The Effects of Walk-in-Place and Overground Walking on the Acquisition of Spatial Information by People With Visual Impairment in Virtual Reality Wayfinding. International Journal of Human–Computer Interaction, 1--19. DOI: 10.1080/10447318.2024.2325177. Online publication date: 13-Mar-2024.
• (2024) A multiplayer VR showdown game for people with visual impairment. Human–Computer Interaction, 1--21. DOI: 10.1080/07370024.2024.2342961. Online publication date: 18-Apr-2024.
• (2024) Virtual reality in transportation and logistics: A clustering analysis of studies from 2010 to 2023 and future directions. Computers in Human Behavior 153, 108082. DOI: 10.1016/j.chb.2023.108082. Online publication date: Apr-2024.
• (2024) Empowering Orientation and Mobility Instructors: Digital Tools for Enhancing Navigation Skills in People Who Are Blind. Computers Helping People with Special Needs, 436--443. DOI: 10.1007/978-3-031-62846-7_52. Online publication date: 5-Jul-2024.
• (2023) Augmented Reality: Current and New Trends in Education. Electronics 12, 16 (3531). DOI: 10.3390/electronics12163531. Online publication date: 21-Aug-2023.
• (2023) A systematic review of extended reality (XR) for understanding and augmenting vision loss. Journal of Vision 23, 5 (5). DOI: 10.1167/jov.23.5.5. Online publication date: 4-May-2023.
• (2023) Dashboards to support rehabilitation in orientation and mobility for people who are blind. Proceedings of the 29th Brazilian Symposium on Multimedia and the Web, 6--10. DOI: 10.1145/3617023.3617027. Online publication date: 23-Oct-2023.
• (2023) Evoking empathy with visually impaired people through an augmented reality embodiment experience. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 184--193. DOI: 10.1109/VR55154.2023.00034. Online publication date: Mar-2023.
• (2023) The Design Space of the Auditory Representation of Objects and Their Behaviours in Virtual Reality for Blind People. IEEE Transactions on Visualization and Computer Graphics 29, 5 (2763--2773). DOI: 10.1109/TVCG.2023.3247094. Online publication date: 22-Feb-2023.
