DOI: 10.1145/2141622.2141691
eyeDog: an assistive-guide robot for the visually impaired

Published: 25 May 2011

Abstract

Visually impaired people can navigate unfamiliar areas by relying on the assistance of other people, canes, or specially trained guide dogs. Guide dogs provide the impaired person with the highest degree of mobility and independence, but require expensive training and selective breeding. In this paper we describe the design and development of a prototype assistive-guide robot (eyeDog) that provides the visually impaired person with autonomous vision-based navigation and laser-based obstacle avoidance. This kind of assistive-guide robot has several advantages, such as robust performance, lower cost, and reduced maintenance. The main components of our system are the Create robotic platform (from iRobot), a netbook, an on-board USB webcam and a LIDAR unit. The camera is the primary exteroceptive sensor for the navigation task; the frames it captures are processed to robustly estimate the position of the vanishing point associated with the road or corridor along which the eyeDog needs to move. The controller then steers the robot until the vanishing point and the image center coincide, a condition that guarantees the robot moves parallel to the direction of the road or corridor. While moving, the robot uses the LIDAR for obstacle avoidance.
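The paper itself includes no code, but the control loop the abstract describes (road/corridor edges extracted from camera frames, a RANSAC vanishing-point estimate, then steering until the vanishing point meets the image center) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the (slope, intercept) line representation, the RANSAC parameters, and the proportional gain are all assumptions, and line extraction (edge detection plus a Hough transform, per the author tags) is presumed to have happened upstream.

```python
import random

def line_intersection(l1, l2):
    """Intersection of two image lines y = m*x + b, or None if (near-)parallel."""
    m1, b1 = l1
    m2, b2 = l2
    if abs(m1 - m2) < 1e-9:
        return None
    x = (b2 - b1) / (m1 - m2)
    return (x, m1 * x + b1)

def point_line_distance(point, line):
    """Perpendicular distance (in pixels) from a point to the line y = m*x + b."""
    m, b = line
    x, y = point
    return abs(m * x - y + b) / (m * m + 1.0) ** 0.5

def ransac_vanishing_point(lines, iterations=200, tol_px=5.0, seed=0):
    """Classic RANSAC: hypothesise a vanishing point from random line pairs
    and keep the candidate that the largest number of lines passes close to."""
    rng = random.Random(seed)
    best_point, best_support = None, -1
    for _ in range(iterations):
        candidate = line_intersection(*rng.sample(lines, 2))
        if candidate is None:
            continue
        support = sum(point_line_distance(candidate, l) < tol_px for l in lines)
        if support > best_support:
            best_point, best_support = candidate, support
    return best_point

def steering_command(vp_x, image_center_x, gain=0.005):
    """Proportional steering: the error is the horizontal offset between the
    vanishing point and the image center; zero error means the robot is
    heading parallel to the road/corridor."""
    return gain * (image_center_x - vp_x)

# Demo: four edge lines converging at pixel (320, 200) plus one spurious line.
road_lines = [(m, 200.0 - 320.0 * m) for m in (-0.5, 0.3, 0.8, -1.0)]
road_lines.append((0.0, 50.0))  # outlier, e.g. a horizontal shadow edge
vp = ransac_vanishing_point(road_lines)
print(vp, steering_command(vp[0], image_center_x=320.0))
```

The RANSAC step tolerates the outlier line because any candidate built from two true road edges is supported by all four inliers, so it outscores candidates involving the spurious line; the steering command then vanishes once the vanishing point sits on the image center column.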




Published In

PETRA '11: Proceedings of the 4th International Conference on PErvasive Technologies Related to Assistive Environments
May 2011
401 pages
ISBN:9781450307727
DOI:10.1145/2141622
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Sponsors

  • NSF: National Science Foundation
  • Foundation of the Hellenic World
  • ICS-FORTH: Institute of Computer Science, Foundation for Research and Technology - Hellas
  • University of Texas at Arlington
  • UCG: University of Central Greece
  • Didaskaleio Konstantinos Karatheodoris, University of the Aegean
  • Fulbright Foundation, Greece
  • Ionian University, Greece

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Hough transform
  2. RANSAC
  3. autonomous assistive robot
  4. vision-based navigation

Qualifiers

  • Research-article

Conference

PETRA '11


Cited By

  • (2024) "We are at the mercy of others' opinion": Supporting Blind People in Recreational Window Shopping with AI-infused Technology. Proceedings of the 21st International Web for All Conference, pp. 45-58. DOI: 10.1145/3677846.3677860. Online publication date: 13-May-2024.
  • (2023) Laser Sensing and Vision Sensing Smart Blind Cane: A Review. Sensors, 23(2):869. DOI: 10.3390/s23020869. Online publication date: 12-Jan-2023.
  • (2023) A Qualitative and Quantitative Analysis of Research in Mobility Technologies for Visually Impaired People. IEEE Access, 11:82496-82520. DOI: 10.1109/ACCESS.2023.3291074. Online publication date: 2023.
  • (2021) Autonomous vehicles and mobility for people with special needs. Transportation Research Part A: Policy and Practice, 150:385-397. DOI: 10.1016/j.tra.2021.06.014. Online publication date: Aug-2021.
  • (2021) Trajectory Correction for Visually Impaired Athletes on 100 m Paralympic Races. Comprehensible Science, pp. 393-401. DOI: 10.1007/978-3-030-85799-8_33. Online publication date: 28-Aug-2021.
  • (2020) Ultrasonic Echolocation Device for Assisting the Visually Impaired. Current Medical Imaging, 16(5):601-610. DOI: 10.2174/1573405615666190423141647. Online publication date: 28-May-2020.
  • (2020) Guiding Blind Pedestrians in Public Spaces by Understanding Walking Behavior of Nearby Pedestrians. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(3):1-22. DOI: 10.1145/3411825. Online publication date: 4-Sep-2020.
  • (2020) Autonomous Visual Navigation for Mobile Robots. ACM Computing Surveys, 53(1):1-34. DOI: 10.1145/3368961. Online publication date: 6-Feb-2020.
  • (2020) BlindPilot: A Robotic Local Navigation System that Leads Blind People to a Landmark Object. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-9. DOI: 10.1145/3334480.3382925. Online publication date: 25-Apr-2020.
  • (2019) Low Energy Precise Navigation System for the Blind with Infrared Sensors. 2019 MIXDES - 26th International Conference "Mixed Design of Integrated Circuits and Systems", pp. 394-397. DOI: 10.23919/MIXDES.2019.8787093. Online publication date: Jun-2019.
