DOI: 10.1145/3610419.3610460
Robust and Scalable Indoor Robot Localization Based on Fusion of Infrastructure Camera Feeds and On-Board Sensors

Published: 02 November 2023

Abstract

In this paper, we propose a method to quantify the spatially-varying uncertainty associated with external-camera-based pose estimates of autonomous mobile robots in an indoor setting. We build an observation model for the camera based on an estimate of an upper bound on the uncertainty and demonstrate through experiments how it can be used to fuse information from an on-board lidar to arrive at accurate localization of the robot in feature-poor or changing environments.
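The abstract describes building an observation model from an upper bound on the camera's spatially-varying pose uncertainty and fusing it with on-board lidar. The paper's implementation details are not given on this page; the following is a minimal sketch of one plausible reading, assuming a particle-filter formulation in which each sensor contributes a Gaussian position likelihood. The names `camera_cov_bound` and `fuse_weights`, and the linear distance-dependent growth of the camera bound, are hypothetical illustrations, not the authors' actual model.

```python
import numpy as np

def camera_cov_bound(xy, sigma0=0.02, k=0.01):
    """Hypothetical upper bound on the camera's 2-D position covariance,
    growing linearly with distance from the camera's ground-plane
    projection (placed at the origin for this sketch)."""
    d = np.linalg.norm(xy)
    s = sigma0 + k * d          # 1-sigma bound in metres
    return (s ** 2) * np.eye(2)

def gauss(err, cov):
    """Unnormalised-safe 2-D Gaussian density of residual `err`."""
    return np.exp(-0.5 * err @ np.linalg.solve(cov, err)) / (
        2.0 * np.pi * np.sqrt(np.linalg.det(cov)))

def fuse_weights(particles, z_cam, z_lidar, lidar_cov):
    """Particle-filter measurement update fusing both sensors.

    particles : (N, 2) array of hypothesised robot positions
    z_cam     : camera-based position estimate (uses the spatial bound)
    z_lidar   : lidar-based position estimate with fixed covariance
    Independent sensors, so the two likelihoods multiply per particle.
    """
    w = np.ones(len(particles))
    for i, p in enumerate(particles):
        w[i] *= gauss(z_cam - p, camera_cov_bound(p))  # camera term
        w[i] *= gauss(z_lidar - p, lidar_cov)          # lidar term
    return w / w.sum()
```

Because the camera covariance is evaluated at each particle's position, regions where the bound is loose automatically down-weight the camera relative to the lidar, which is one way the fusion could remain robust in feature-poor areas.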


Cited By

  • (2024) Zero-Shot Pose Estimation and Tracking of Autonomous Mobile Robots using Infrastructure Vision Sensors - An End-to-End Perception Framework. In Proceedings of the Fifteenth Indian Conference on Computer Vision, Graphics and Image Processing, 1–9. https://doi.org/10.1145/3702250.3702269. Online publication date: 13 December 2024.

Published In

AIR '23: Proceedings of the 2023 6th International Conference on Advances in Robotics
July 2023, 583 pages
ISBN: 9781450399807
DOI: 10.1145/3610419

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. camera measurement uncertainty
  2. external camera-based pose
  3. homography
  4. indoor robot localization
  5. lidar-camera fusion

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

AIR 2023

Acceptance Rates

Overall acceptance rate: 69 of 140 submissions (49%)

Article Metrics

  • Downloads (last 12 months): 29
  • Downloads (last 6 weeks): 1
Reflects downloads up to 28 January 2025

