DOI: 10.1145/3572921.3572929
Research article — OzCHI Conference Proceedings

Evaluating micro-guidance sonification methods in manual tasks for Blind and Visually Impaired people

Published: 06 April 2023

Abstract

This paper presents a user evaluation of seven sonification methods for two-dimensional (2D) manual micro-guidance tasks, which can serve as building blocks for spatialized audio in Mixed and Virtual Reality when modelling next-generation guidance aids for Blind and Visually Impaired (BVI) people. The methods were tested as comparable interactive sonifications of 2D positions in a series of hand-navigation assessments with BVI and blindfolded sighted users, to validate the different approaches in environments without any visual feedback. Results highlighted that alternation and spatiality can be useful resources in sonified guidance, and that users accustomed to faster-than-regular audio playback tend to perform more precisely, while musical literacy affected performance only for methods highly dependent on aural skills. Ultimately, this work corroborates the notion that sonification can help BVI users perform better in day-to-day manual micro-guidance tasks such as retrieving items from a pantry, handling kitchen appliances, and properly discarding trash.
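The paper's exact audio mappings are not reproduced on this page. As a minimal illustrative sketch only (the function name, parameter ranges, and the specific pitch/pan mapping below are assumptions, not the paper's method), a sonification of a 2D hand-to-target offset might combine spatiality (stereo pan from horizontal error) with a distance-driven pitch cue:

```python
import math

def sonification_params(dx, dy, base_freq=440.0, max_dist=1.0):
    """Hypothetical mapping of a 2D hand-to-target offset to audio parameters.

    dx, dy: horizontal and vertical error, in the same units as max_dist.
    Returns (pan, freq): pan in [-1, 1] (left/right), and a sine frequency
    that rises from base_freq toward one octave above as the hand closes in.
    """
    # Spatiality: pan follows the horizontal offset, clamped to [-1, 1].
    pan = max(-1.0, min(1.0, dx / max_dist))
    # Distance cue: pitch rises as the Euclidean error shrinks.
    dist = min(math.hypot(dx, dy), max_dist)
    freq = base_freq * (2.0 - dist / max_dist)
    return pan, freq
```

On target, this yields a centred pan and the octave-up frequency; at the maximum distance directly to the right, a hard-right pan at the base frequency. A real guidance aid would feed these parameters to a spatial-audio renderer rather than compute them in isolation.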

Supplementary Material

MP4 File (Evaluating micro-guidance sonification methods in manual tasks for Visually Impaired people [demo].mp4)
Video Demonstration


Cited By

  • (2024) Exploring Audio Interfaces for Vertical Guidance in Augmented Reality via Hand-Based Feedback. IEEE Transactions on Visualization and Computer Graphics 30:5, 2818–2828. https://doi.org/10.1109/TVCG.2024.3372040 (published 4 Mar 2024)
  • (2024) Enhanced Accessibility for Mobile Indoor Navigation. 2024 14th International Conference on Indoor Positioning and Indoor Navigation (IPIN), 1–6. https://doi.org/10.1109/IPIN62893.2024.10786147 (published 14 Oct 2024)
  • (2023) XR towards tele-guidance: mixing realities in assistive technologies for blind and visually impaired people. 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 324–329. https://doi.org/10.1109/VRW58643.2023.00074 (published Mar 2023)
  • (2023) Evoking empathy with visually impaired people through an augmented reality embodiment experience. 2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 184–193. https://doi.org/10.1109/VR55154.2023.00034 (published Mar 2023)
  • (2023) Immersive tele-guidance towards evoking empathy with people who are vision impaired. 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 809–814. https://doi.org/10.1109/ISMAR-Adjunct60411.2023.00179 (published 16 Oct 2023)


            Published In

            OzCHI '22: Proceedings of the 34th Australian Conference on Human-Computer Interaction
            November 2022
            373 pages

            Publisher

            Association for Computing Machinery

            New York, NY, United States


            Author Tags

            1. assistive technologies
            2. blind and visually impaired guidance
            3. micro-guidance
            4. mixed reality accessibility

            Qualifiers

            • Research-article
            • Research
            • Refereed limited

            Conference

            OzCHI '22
            OzCHI '22: 34th Australian Conference on Human-Computer Interaction
            November 29 - December 2, 2022
            ACT, Canberra, Australia

            Acceptance Rates

            Overall Acceptance Rate 362 of 729 submissions, 50%


