RVIO: An Effective Localization Algorithm for Range-Aided Visual-Inertial Odometry System

Published: 02 October 2023

Abstract

This paper presents an efficient and accurate range-aided visual-inertial odometry (RVIO) system for global positioning system (GPS)-denied environments. Ultra-wideband (UWB) range measurements are integrated to reduce the long-term drift of the visual-inertial odometry (VIO) system. Our approach starts with a filter-based scheme that localizes the unknown UWB anchors in the local world frame; in particular, a novel surface-based particle filter is proposed to localize the anchors efficiently. Once initialization is complete, the UWB anchor locations are used to support subsequent long-term robot positioning. An observability-constrained optimization approach is developed to fuse the visual, inertial, and UWB range measurements. This framework takes advantage of both VIO and UWB measurements and remains feasible even when fewer than four UWB anchors are observed. Experiments on both simulated and real-world scenes demonstrate the validity and superiority of the proposed system.
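The anchor-initialization idea can be illustrated with a generic range-only particle filter (not the paper's surface-based variant): particles hypothesize the anchor position, and each UWB range taken from a known robot pose reweights and resamples them. All function names, noise levels, and the toy trajectory below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def init_particles(n, lo, hi):
    """Uniformly sample candidate anchor positions in a 3-D search box."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    return lo + np.random.rand(n, 3) * (hi - lo)

def reweight(particles, weights, robot_pos, measured_range, sigma=0.1):
    """Scale weights by the Gaussian likelihood of one UWB range reading."""
    predicted = np.linalg.norm(particles - robot_pos, axis=1)
    weights = weights * np.exp(-0.5 * ((predicted - measured_range) / sigma) ** 2)
    weights = weights + 1e-300          # guard against total underflow
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling, with slight roughening noise so the cloud
    keeps diversity even though the anchor is static."""
    n = len(weights)
    positions = (np.arange(n) + np.random.rand()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    jitter = np.random.normal(0.0, 0.02, particles.shape)
    return particles[idx] + jitter, np.full(n, 1.0 / n)

# Toy run: a robot circles an unknown static anchor and collects noisy ranges.
np.random.seed(0)
rng = np.random.default_rng(0)
anchor = np.array([2.0, -1.0, 1.5])        # ground truth, unknown to the filter
particles = init_particles(5000, [-5, -5, 0], [5, 5, 3])
weights = np.full(len(particles), 1.0 / len(particles))
for t in np.linspace(0.0, 2.0 * np.pi, 40):
    robot = np.array([3.0 * np.cos(t), 3.0 * np.sin(t), 0.5])
    r = np.linalg.norm(anchor - robot) + rng.normal(0.0, 0.05)
    weights = reweight(particles, weights, robot, r)
    particles, weights = resample(particles, weights)
estimate = weights @ particles
```

Each range measurement confines the anchor to a spherical shell around the robot; intersecting many shells from different poses collapses the particle cloud onto the anchor, which is why a filter can initialize an anchor from range data alone.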
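After initialization, range measurements enter the estimator as additional residuals alongside the motion constraints. The sketch below is a generic least-squares illustration of that fusion, not the paper's observability-constrained formulation: it stacks a prior on the first pose, relative-odometry factors (standing in for VIO), and range factors to a single known anchor, then shows that the ranges pull a drifting dead-reckoned trajectory back toward the truth. All weights and toy data are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, p0, odom, anchor, ranges, w_odom=1.0, w_range=1.0, w_prior=10.0):
    """Stacked residuals: first-pose prior, relative-motion factors,
    and UWB range factors to one known anchor."""
    pos = x.reshape(-1, 3)
    res = [w_prior * (pos[0] - p0)]
    res += [w_odom * (pos[i + 1] - pos[i] - d) for i, d in enumerate(odom)]
    res += [np.atleast_1d(w_range * (np.linalg.norm(pos[i] - anchor) - r))
            for i, r in ranges]
    return np.concatenate(res)

# Toy data: straight-line trajectory, odometry with a constant bias (drift),
# and noise-free ranges to a single off-trajectory anchor.
true = np.array([[0.5 * i, 0.0, 0.0] for i in range(20)])
anchor = np.array([5.0, 2.0, 1.0])
bias = np.array([0.01, 0.02, 0.0])
odom = [true[i + 1] - true[i] + bias for i in range(19)]
ranges = [(i, float(np.linalg.norm(true[i] - anchor))) for i in range(20)]

# Dead reckoning accumulates the odometry bias into unbounded drift.
drifted = np.vstack([true[0], true[0] + np.cumsum(odom, axis=0)])
sol = least_squares(residuals, drifted.ravel(),
                    args=(true[0], odom, anchor, ranges))
refined = sol.x.reshape(-1, 3)

err_before = np.linalg.norm(drifted - true, axis=1).mean()
err_after = np.linalg.norm(refined - true, axis=1).mean()
```

Because the anchor is fixed in the world frame, its range residuals bound the drift that pure dead reckoning accumulates; with relative-motion factors supplying the remaining directions, the joint problem stays solvable even with a single anchor, in the spirit of the abstract's fewer-than-four-anchors claim.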


Cited By

  • HCCNet: Hybrid Coupled Cooperative Network for Robust Indoor Localization, ACM Transactions on Sensor Networks, vol. 20, no. 4, pp. 1–22, Jul. 2024, doi: 10.1145/3665645.


        Published In

        IEEE Transactions on Intelligent Transportation Systems, Volume 25, Issue 2
        Feb. 2024
        1100 pages

        Publisher

        IEEE Press


        Qualifiers

        • Research-article
