Research Article | Public Access

Variability in Reactions to Instructional Guidance during Smartphone-Based Assisted Navigation of Blind Users

Published: 18 September 2018
    Abstract

    'Turn slightly to the left,' the navigational system announces, aiming to direct a blind user to merge into a corridor. Yet, due to a long reaction time, the user turns too late and proceeds into the wrong hallway. Observations of such user behavior in real-world navigation settings motivate us to study how blind users react to the instructional feedback of a turn-by-turn guidance system. We found little previous work analyzing the extent of variability among blind users in their reactions to different instructional guidance during assisted navigation. To gain insight into how navigational interfaces can be better designed to accommodate the information needs of different users, we conduct a data-driven analysis of reaction variability as defined by motion and timing measures. Based on continuously tracked user motion during real-world navigation with a deployed system, we find significant variability between users in their reaction characteristics. Specifically, the statistical analysis reveals significant variability during the crucial elements of navigation (e.g., turning and encountering obstacles). With the end-user experience in mind, we identify the need to adjust interface timing and content not only to each user's personal walking pace but also to their individual navigation skill and style. The design implications of our study inform the development of assistive systems that consider such user-specific behavior to ensure successful navigation.
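
    To make the abstract's notion of motion and timing measures concrete, the sketch below shows one way a reaction time could be derived from continuously tracked heading data following a turn instruction, and how between-user differences could then be tested. This is a minimal illustration, not the authors' analysis pipeline: the synthetic traces, the 10-degree heading-change onset threshold, the reaction_time helper, and the choice of a Kruskal-Wallis test are all assumptions made for the example.

        # Illustrative sketch (not the authors' code). Synthetic data only;
        # trace generation, thresholds, and the statistical test are assumptions.
        import numpy as np
        from scipy.stats import kruskal

        rng = np.random.default_rng(0)

        def reaction_time(t, heading_deg, t_instruction, onset_threshold_deg=10.0):
            """Seconds from the instruction until the heading has changed by more
            than onset_threshold_deg relative to the heading at the instruction."""
            after = t >= t_instruction
            if not after.any():
                return np.nan
            ref = heading_deg[after][0]
            delta = np.abs(heading_deg[after] - ref)
            onset = np.nonzero(delta > onset_threshold_deg)[0]
            return t[after][onset[0]] - t_instruction if onset.size else np.nan

        # Synthetic traces: 3 users x 20 turn events, heading sampled at 10 Hz,
        # instruction announced at t = 2.0 s, a 90-degree turn executed after a
        # per-user reaction delay.
        per_user_rts = []
        for user_mean_rt in (0.8, 1.4, 2.1):      # assumed per-user mean delay (s)
            rts = []
            for _ in range(20):
                t = np.arange(0.0, 8.0, 0.1)
                rt_true = max(0.2, rng.normal(user_mean_rt, 0.3))
                heading = np.where(t < 2.0 + rt_true, 0.0,
                                   np.minimum(90.0, (t - 2.0 - rt_true) * 60.0))
                heading = heading + rng.normal(0.0, 1.0, t.size)   # sensor noise
                rts.append(reaction_time(t, heading, t_instruction=2.0))
            per_user_rts.append(np.array(rts))

        # Non-parametric test for between-user differences in reaction time.
        H, p = kruskal(*per_user_rts)
        for i, rts in enumerate(per_user_rts, 1):
            print(f"user {i}: median reaction time = {np.nanmedian(rts):.2f} s")
        print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.4f}")

    In this toy setup the per-user medians separate clearly and the test flags a significant difference; in a real deployment the onset definition would also need to account for localization noise, veering, and the timing of the speech output.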

    Supplementary Material

    ohn-bar.zip: supplemental movie, appendix, image, and software files for "Variability in Reactions to Instructional Guidance during Smartphone-Based Assisted Navigation of Blind Users".

    Published In

    Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 2, Issue 3
    September 2018
    1536 pages
    EISSN: 2474-9567
    DOI: 10.1145/3279953
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 18 September 2018
    Accepted: 01 September 2018
    Revised: 01 May 2018
    Received: 01 February 2018
    Published in IMWUT Volume 2, Issue 3

    Author Tags

    1. Indoor navigation
    2. accessibility
    3. blind users
    4. clustering motion patterns
    5. motion analysis
    6. navigation task performance
    7. reaction time
    8. task timing
    9. turn-by-turn navigation

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Cited By

    • (2024) WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(2), 1-32. DOI: 10.1145/3659592. Online publication date: 15-May-2024.
    • (2023) Design of Auxiliary Application for the Running Activities of Visually Impaired Individuals and Guide Runners. Design 8(4), 3973-3980. DOI: 10.12677/Design.2023.84488. Online publication date: 2023.
    • (2023) Exploring the User Experience of an AI-based Smartphone Navigation Assistant for People with Visual Impairments. Proceedings of the 15th Biannual Conference of the Italian SIGCHI Chapter, 1-8. DOI: 10.1145/3605390.3605421. Online publication date: 20-Sep-2023.
    • (2023) The BLV App Arcade: a new curated repository and evaluation rubric for mobile applications supporting blindness and low vision. Disability and Rehabilitation: Assistive Technology 19(4), 1405-1414. DOI: 10.1080/17483107.2023.2187094. Online publication date: 16-Mar-2023.
    • (2022) Beyond the Cane: Describing Urban Scenes to Blind People for Mobility Tasks. ACM Transactions on Accessible Computing 15(3), 1-29. DOI: 10.1145/3522757. Online publication date: 19-Aug-2022.
    • (2022) Traveling More Independently: A Study on the Diverse Needs and Challenges of People with Visual or Mobility Impairments in Unfamiliar Indoor Environments. ACM Transactions on Accessible Computing 15(2), 1-44. DOI: 10.1145/3514255. Online publication date: 19-May-2022.
    • (2022) Misalignment in Semantic User Model Elicitation via Conversational Agents: A Case Study in Navigation Support for Visually Impaired People. International Journal of Human–Computer Interaction 38(18-20), 1909-1925. DOI: 10.1080/10447318.2022.2059925. Online publication date: 26-Apr-2022.
    • (2021) LightGuide. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(2), 1-27. DOI: 10.1145/3463524. Online publication date: 24-Jun-2021.
    • (2021) Personalized Navigation that Links Speaker's Ambiguous Descriptions to Indoor Objects for Low Vision People. Universal Access in Human-Computer Interaction: Access to Media, Learning and Assistive Environments, 412-423. DOI: 10.1007/978-3-030-78095-1_30. Online publication date: 3-Jul-2021.
    • (2020) Virtual Paving. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 4(3), 1-25. DOI: 10.1145/3411814. Online publication date: 4-Sep-2020.
