Abstract
Most localization algorithms are either range-based or vision-based, but using only one type of sensor often cannot ensure successful localization. This paper proposes a particle filter-based localization method that combines the range information obtained from a low-cost IR scanner with the SIFT-based visual information obtained from a monocular camera to robustly estimate the robot pose. The rough pose estimate provided by the range sensor can be refined by the visual information from the camera, and the slow visual object recognition can be compensated for by frequent updates of the range information. Although the bandwidths of the two sensors are different, they can be synchronized by using the encoder information of the mobile robot. Therefore, all data from both sensors are used to estimate the robot pose without time delay, and the samples used for pose estimation converge faster than in either range-only or vision-only localization. This paper also suggests a method for evaluating the state of localization based on the normalized probability of a vision sensor model. Experiments show that the proposed algorithm reliably estimates the robot pose in various indoor environments and can recover the pose after incorrect localization.
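The fusion scheme summarized above can be illustrated with a minimal particle-filter sketch. The code below is only an illustration under stated assumptions, not the authors' implementation: the Gaussian sensor models, the noise parameters, and the per-particle error inputs are hypothetical stand-ins for the IR-scan and SIFT-matching likelihoods, and odometry is applied as a simple additive motion model.

import numpy as np

rng = np.random.default_rng(0)

def motion_update(particles, odom, noise=(0.02, 0.02, 0.01)):
    """Propagate particles (N x 3 array of x, y, theta) by encoder odometry plus noise."""
    particles = particles + np.asarray(odom)
    particles = particles + rng.normal(0.0, noise, size=particles.shape)
    return particles

def gaussian_likelihood(error, sigma):
    """Unnormalized Gaussian sensor model shared here by the range and vision terms."""
    return np.exp(-0.5 * (np.asarray(error) / sigma) ** 2)

def measurement_update(particles, range_error=None, vision_error=None):
    """Weight and resample particles using whichever measurements arrived this step."""
    weights = np.ones(len(particles))
    if range_error is not None:       # frequent, low-cost IR scan update
        weights = weights * gaussian_likelihood(range_error, sigma=0.1)
    if vision_error is not None:      # slower SIFT-based vision update
        weights = weights * gaussian_likelihood(vision_error, sigma=0.5)
    weights = weights / weights.sum()
    # The normalized vision-model probability could also be monitored as a cue
    # for detecting incorrect localization, in the spirit of the abstract.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Toy usage: 100 particles, one odometry step, then a fused measurement update.
particles = rng.normal(0.0, 0.5, size=(100, 3))
particles = motion_update(particles, odom=(0.10, 0.0, 0.05))
range_err = rng.normal(0.0, 0.1, size=100)   # stand-in per-particle scan errors
vision_err = rng.normal(0.0, 0.5, size=100)  # stand-in per-particle SIFT errors
particles = measurement_update(particles, range_err, vision_err)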
Additional information
Recommended by Editorial Board member Sooyong Lee under the direction of Editor Hyun Seok Yang. This research was conducted as part of the Intelligent Robotics Development Program, one of the 21st Century Frontier R&D Programs funded by the Ministry of Knowledge Economy of Korea.
Yong-Ju Lee received the B.S. degree in Mechanical Engineering from Korea University in 2004. He is currently a Ph.D. student in Mechanical Engineering at Korea University. His research interests include mobile robotics.
Byung-Doo Yim received the B.S. degree in Control and Instrumentation Engineering from Seoul National University of Technology in 2005 and the M.S. degree in Mechatronics Engineering from Korea University in 2007. His research interests include mobile robotics.
Jae-Bok Song received the B.S. and M.S. degrees in Mechanical Engineering from Seoul National University in 1983 and 1985, respectively, and the Ph.D. degree in Mechanical Engineering from MIT in 1992. He is currently a Professor of Mechanical Engineering at Korea University, where he has also been the Director of the Intelligent Robotics Laboratory since 1993. His current research interests lie mainly in mobile robotics, safe robot arms, and the design and control of intelligent robotic systems.
About this article
Cite this article
Lee, YJ., Yim, BD. & Song, JB. Mobile robot localization based on effective combination of vision and range sensors. Int. J. Control Autom. Syst. 7, 97–104 (2009). https://doi.org/10.1007/s12555-009-0112-0