Deep Learning Compensation of Rotation Errors During Navigation Assistance for People with Visual Impairments or Blindness

Published: 16 December 2019

Abstract

Navigation assistive technologies are designed to support people with visual impairments during mobility. In particular, turn-by-turn navigation is commonly used to provide walk and turn instructions, without requiring any prior knowledge of the traversed environment. To ensure safe and reliable guidance, many research efforts focus on improving the localization accuracy of such systems. However, even when localization is accurate, imprecision in conveying guidance instructions to the user, and in how the user follows them, can still lead to unrecoverable navigation errors. Even slight errors during rotations, amplified by the subsequent forward movement, can result in the user taking an incorrect and possibly dangerous path.
In this article, we analyze trajectories of indoor trips in four different environments, showing that rotation errors are frequent in state-of-the-art navigation assistance for people with visual impairments. Such errors, caused by the delay between the instruction to stop rotating and the moment the user actually stops, result in over-rotation. To compensate for over-rotation, we propose a technique that anticipates the stop instruction so that the user stops rotating closer to the target angle. The technique predicts over-rotation using a deep learning model that takes into account the user’s current rotation speed, duration, and angle; the model is trained on a dataset of rotations performed by blind individuals. By analyzing existing datasets, we show that our approach outperforms a naive baseline that predicts over-rotation with a fixed value. Experiments with 11 blind participants also show that the proposed compensation method results in lower rotation errors (18.8° on average) than the non-compensated approach adopted in state-of-the-art solutions (30.1°).
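
To make the compensation idea concrete, the sketch below shows one plausible way to structure the predictor: a small feed-forward regressor that maps the user’s current rotation speed, elapsed rotation duration, and rotated angle to an expected over-rotation, which is then used to issue the stop instruction early. This is a minimal illustration in Python with Keras, not the authors’ implementation; the network size, feature ranges, and placeholder training data are assumptions, whereas the actual model is trained on rotations recorded from blind participants.

    # Minimal sketch (assumed architecture, not the paper's exact model):
    # predict how many degrees the user will over-rotate after a "stop"
    # instruction, from [rotation speed, elapsed duration, rotated angle].
    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    def build_model() -> keras.Model:
        model = keras.Sequential([
            keras.Input(shape=(3,)),              # [speed (deg/s), duration (s), angle (deg)]
            layers.Dense(16, activation="relu"),
            layers.Dense(16, activation="relu"),
            layers.Dense(1),                      # predicted over-rotation in degrees
        ])
        model.compile(optimizer="adam", loss="mae")  # mean absolute angular error
        return model

    rng = np.random.default_rng(0)
    # Placeholder training data; the real model is trained on recorded rotations.
    X_train = rng.uniform([0.0, 0.0, 0.0], [180.0, 5.0, 180.0], size=(1000, 3)).astype("float32")
    y_train = rng.uniform(0.0, 30.0, size=(1000, 1)).astype("float32")

    model = build_model()
    model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)

    def should_stop(remaining_deg, speed, duration, angle):
        # Anticipate the stop instruction: stop as soon as the remaining angle
        # is no larger than the predicted overshoot, so inertia carries the
        # user approximately onto the target heading.
        features = np.array([[speed, duration, angle]], dtype="float32")
        predicted_overshoot = float(model.predict(features, verbose=0)[0, 0])
        return remaining_deg <= predicted_overshoot

In use, a guidance loop would call should_stop with the live rotation state and issue the stop instruction as soon as it returns true, so that the user’s residual rotation lands closer to the target angle.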

Published In

ACM Transactions on Accessible Computing, Volume 12, Issue 4
Regular Papers and Special Issue on ASSETS 2018
December 2019
158 pages
ISSN:1936-7228
EISSN:1936-7236
DOI:10.1145/3375992
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 16 December 2019
Accepted: 01 October 2019
Revised: 01 August 2019
Received: 01 April 2019
Published in TACCESS Volume 12, Issue 4

Author Tags

  1. Turn-by-turn navigation
  2. navigation assistance
  3. orientation & mobility

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

  • (2024) Toward the design of persuasive systems for a healthy workplace: a real-time posture detection. Frontiers in Big Data, 7. DOI: 10.3389/fdata.2024.1359906. Online publication date: 17-Jun-2024.
  • (2023) Enhancing Wayfinding Experience in Low-Vision Individuals through a Tailored Mobile Guidance Interface. Electronics, 12(22), 4561. DOI: 10.3390/electronics12224561. Online publication date: 7-Nov-2023.
  • (2023) "I Want to Figure Things Out": Supporting Exploration in Navigation for People with Visual Impairments. Proceedings of the ACM on Human-Computer Interaction, 7(CSCW1), 1-28. DOI: 10.1145/3579496. Online publication date: 16-Apr-2023.
  • (2023) Sonification of navigation instructions for people with visual impairment. International Journal of Human-Computer Studies, 177, 103057. DOI: 10.1016/j.ijhcs.2023.103057. Online publication date: Sep-2023.
  • (2021) Inertial Measurement Unit Sensors in Assistive Technologies for Visually Impaired People, a Review. Sensors, 21(14), 4767. DOI: 10.3390/s21144767. Online publication date: 13-Jul-2021.
  • (2021) Iterative Design of Sonification Techniques to Support People with Visual Impairments in Obstacle Avoidance. ACM Transactions on Accessible Computing, 14(4), 1-27. DOI: 10.1145/3470649. Online publication date: 15-Oct-2021.
  • (2021) Clew3D: Automated Generation of O&M Instructions Using LIDAR-Equipped Smartphones. Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility, 1-3. DOI: 10.1145/3441852.3476564. Online publication date: 17-Oct-2021.
  • (2021) Remote Usage Data Collection and Analysis for Mobile Accessibility Applications. 2021 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 93-98. DOI: 10.1109/PerComWorkshops51409.2021.9430968. Online publication date: 22-Mar-2021.
  • (2021) Analysis of Navigation Assistants for Blind and Visually Impaired People: A Systematic Review. IEEE Access, 9, 26712-26734. DOI: 10.1109/ACCESS.2021.3052415. Online publication date: 2021.
  • (2021) Personalized Navigation that Links Speaker’s Ambiguous Descriptions to Indoor Objects for Low Vision People. Universal Access in Human-Computer Interaction. Access to Media, Learning and Assistive Environments, 412-423. DOI: 10.1007/978-3-030-78095-1_30. Online publication date: 3-Jul-2021.
