DOI: 10.1145/3568162.3578630
Research article · Open access

Exploring Levels of Control for a Navigation Assistant for Blind Travelers

Published: 13 March 2023

Abstract

Only a small percentage of blind and low-vision people use traditional mobility aids such as a cane or a guide dog. Various assistive technologies have been proposed to address the limitations of traditional mobility aids. These devices typically give the majority of control to either the user or the device. In this work, we explore how varying levels of control affect users' sense of agency, trust in the device, confidence, and navigation success. We present Glide, a novel mobility aid with two control modes: Glide-directed and User-directed. We employ Glide in a study (N=9) in which blind or low-vision participants used both modes to navigate through an indoor environment. Overall, participants found Glide easy to use and learn. Most participants trusted Glide despite its current limitations, and their confidence and performance increased as they continued to use it. Users' control-mode preferences varied across situations; no single mode "won" in all of them.
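The contrast between the two modes can be pictured as a simple command-arbitration policy: one party leads, and the other can only moderate. The sketch below is a minimal illustration of that idea, not the authors' implementation; the mode names echo the paper, but every class, function, and parameter here is hypothetical.

# Illustrative sketch only -- not the paper's actual control stack.
# It frames "levels of control" as a blend between a user's steering
# command and the device's planned command, with the leading party
# passing through and the other holding only a speed veto.
from dataclasses import dataclass
from enum import Enum


class Mode(Enum):
    GLIDE_DIRECTED = "glide-directed"   # device leads, user can slow/stop
    USER_DIRECTED = "user-directed"     # user leads, device can slow/stop


@dataclass
class Command:
    heading_deg: float  # desired heading change, degrees
    speed: float        # normalized forward speed, 0..1


def arbitrate(mode: Mode, user: Command, device: Command,
              obstacle_close: bool) -> Command:
    """Blend user and device commands according to the active mode.

    Hypothetical policy: the leader's heading is followed; the other
    party can only reduce speed. An imminent obstacle always stops
    forward motion regardless of who is in control.
    """
    if mode is Mode.GLIDE_DIRECTED:
        out = Command(device.heading_deg, min(device.speed, user.speed))
    else:
        out = Command(user.heading_deg, min(user.speed, device.speed))
    if obstacle_close:
        out = Command(out.heading_deg, 0.0)  # hard stop near obstacles
    return out


if __name__ == "__main__":
    user = Command(heading_deg=15.0, speed=0.8)
    device = Command(heading_deg=-5.0, speed=0.5)
    print(arbitrate(Mode.GLIDE_DIRECTED, user, device, obstacle_close=False))
    print(arbitrate(Mode.USER_DIRECTED, user, device, obstacle_close=False))

Under this hypothetical policy, Glide-directed mode follows the device's planned heading while the user retains a speed veto, and User-directed mode inverts the roles; the study asks how such shifts in authority affect agency, trust, and confidence.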

Supplementary Material

MP4 File (HRI23-fp1087.mp4)
Presentation video for the paper "Exploring Levels of Control for a Navigation Assistant for Blind Travelers."




    Published In

    HRI '23: Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction
    March 2023
    631 pages
    ISBN:9781450399647
    DOI:10.1145/3568162
This work is licensed under a Creative Commons Attribution 4.0 International License.


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. assistive navigation
    2. robotics
    3. user study


    Conference

    HRI '23

    Acceptance Rates

Overall acceptance rate: 268 of 1,124 submissions, 24%


    Cited By

    • (2024) A Smart Cane Based on 2D LiDAR and RGB-D Camera Sensor-Realizing Navigation and Obstacle Recognition. Sensors 24(3), 870. DOI: 10.3390/s24030870. Online publication date: 29-Jan-2024.
    • (2024) Physically Assistive Robots: A Systematic Review of Mobile and Manipulator Robots That Physically Assist People with Disabilities. Annual Review of Control, Robotics, and Autonomous Systems 7(1), 123-147. DOI: 10.1146/annurev-control-062823-024352. Online publication date: 10-Jul-2024.
    • (2024) Snap&Nav: Smartphone-based Indoor Navigation System For Blind People via Floor Map Analysis and Intersection Detection. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-22. DOI: 10.1145/3676522. Online publication date: 24-Sep-2024.
    • (2024) Charting User Experience in Physical Human-Robot Interaction. ACM Transactions on Human-Robot Interaction 13(2), 1-29. DOI: 10.1145/3659058. Online publication date: 28-Jun-2024.
    • (2024) Navigating Real-World Challenges: A Quadruped Robot Guiding System for Visually Impaired People in Diverse Environments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642227. Online publication date: 11-May-2024.
    • (2024) What Is It Like for Visually Impaired Individuals To Touch a Table-Top Humanoid Robot? Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 623-627. DOI: 10.1145/3610978.3640751. Online publication date: 11-Mar-2024.
    • (2023) Assistive Robots for Persons with Visual Impairments: Current Research and Open Challenges. Proceedings of the 16th International Conference on PErvasive Technologies Related to Assistive Environments, 413-416. DOI: 10.1145/3594806.3596593. Online publication date: 5-Jul-2023.
