Fast and robust global registration for terrestrial robots @ ICRA2022

Quatro

Official page of "A Single Correspondence Is Enough: Robust Global Registration to Avoid Degeneracy in Urban Environments", accepted at ICRA 2022. Note that this repository is a re-implementation, so it is not exactly the same as the original version.

Demo

NEWS (May 21, 2024)

Quatro is now fully supported in the TEASER++ library. If you want a ROS-free version, please visit that repository :)

NEWS (Jan. 27 2023)

  • An improved version is under review at the International Journal of Robotics Research (IJRR)
  • The code will be refactored and updated soon!

Characteristics

  • Single hpp file (include/quatro.hpp). It requires other hpp files, though 😅
  • Intuitive usage based on the hierarchical inheritance of the Registration class in the Point Cloud Library (PCL). Quatro can be used as follows:
// After declaring and configuring a Quatro instance,
quatro.setInputSource(srcMatched);    // matched source points
quatro.setInputTarget(tgtMatched);    // matched target points
Eigen::Matrix4d output;               // estimated 4x4 transformation
quatro.computeTransformation(output);
  • Robust global registration performance

    • As shown in the figures below, our method shows the most promising performance compared with other state-of-the-art methods.
    • This may be an overclaim, yet our method is more appropriate for terrestrial mobile platforms. Reducing the degrees of freedom (DoF) usually makes algorithms more robust against errors by projecting them into a lower dimension.
      • E.g. 3D-3D correspondences -> 3D-2D correspondences, LOAM -> LeGO-LOAM
    • In this context, our method is a dimension-reduced version of TEASER++!
(Figures: results on the KITTI dataset and the NAVER LABS Loc. dataset)

Contributors

  • Beomsoo Kim (as a research intern): qjatn1208@naver.com
  • Daebeom Kim (as a research intern): ted97k@naver.com
  • Hyungtae Lim: shapelim@kaist.ac.kr

ToDo

  • Add ROS support
  • Add demo videos
  • Add preprint paper
  • Add diverse examples for other sensor configurations

Contents

  1. Test Env.
  2. How to Build
  3. How to Run Quatro
  4. Citation

Test Env.

The code was tested successfully on

  • Ubuntu 18.04 LTS
  • ROS Melodic

How to Build

ROS Setting

  1. Install the following dependencies:
sudo apt install cmake libeigen3-dev libboost-all-dev
  2. Install ROS on your machine.
  3. Then, build the Quatro package and enjoy! :) We use catkin tools:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone git@github.com:url-kaist/Quatro.git
cd ~/catkin_ws
catkin build quatro

Note: Quatro requires the pmc library, which is automatically installed via 3rdparty/find_dependencies.cmake.

How to Run Quatro

Prerequisites

In this study, the fast point feature histogram (FPFH) is utilized, which is a widely used conventional descriptor for registration. However, computing the original FPFH on a 3D point cloud captured by a 64-channel LiDAR sensor takes tens of seconds, which is too slow. In short, feature extraction & matching is still the bottleneck of global registration 😟 (in fact, descriptor accuracy is not critical because Quatro is extremely robust against outliers!).

For this reason, we employ voxel-sampled FPFH: the cloud is voxel-sampled first, FPFH is computed on the sampled points, and a correspondence test follows. In addition, we employ Patchwork, the state-of-the-art ground segmentation method, and image projection to reject some subclusters, as proposed in LeGO-LOAM. These modules are not presented in our paper!

Finally, we reduce the computational time of feature extraction & matching, i.e. the front-end of global registration, from tens of seconds to about 0.2 sec. The overall pipeline is as follows:

Note

To fine-tune the parameters for your own situation, please refer to the config folder. In particular, for fine-tuning of Patchwork, please refer to this Wiki.

TL;DR

  1. Download the toy pcd bin files.

The point clouds are from the KITTI dataset, so they were captured by a Velodyne HDL-64E.

The toy pcds are downloaded automatically. If there is a problem, follow the commands below:

roscd quatro
cd materials
wget https://urserver.kaist.ac.kr/publicdata/quatro/000540.bin
wget https://urserver.kaist.ac.kr/publicdata/quatro/001319.bin
  2. Launch the roslaunch file as follows:
OMP_NUM_THREADS=4 roslaunch quatro quatro.launch

(Unfortunately, the first run shows rather slow and imprecise performance. This may be due to multi-threading issues.)

(Figures: visualized inner pipelines; source (red), target (green), and the estimated output (blue))

Citation

If our research has been helpful, please cite the papers below:

@article{lim2024quatro++,
  title={Quatro++: Robust global registration exploiting ground segmentation for loop closing in LiDAR SLAM},
  author={Lim, Hyungtae and Kim, Beomsoo and Kim, Daebeom and Mason Lee, Eungchang and Myung, Hyun},
  journal={The International Journal of Robotics Research},
  volume={43},
  number={5},
  pages={685--715},
  year={2024}
}
@inproceedings{lim2022quatro,
  title={A Single Correspondence Is Enough: Robust Global Registration to Avoid Degeneracy in Urban Environments},
  author={Lim, Hyungtae and Yeon, Suyong and Ryu, Suyong and Lee, Yonghan and Kim, Youngji and Yun, Jaeseong and Jung, Euigon and Lee, Donghwan and Myung, Hyun},
  booktitle={Proc. IEEE Int. Conf. Robot. Autom.},
  pages={8010--8017},
  year={2022}
}
@article{lim2021patchwork,
  title={Patchwork: Concentric Zone-based Region-wise Ground Segmentation with Ground Likelihood Estimation Using a 3D LiDAR Sensor},
  author={Lim, Hyungtae and Oh, Minho and Myung, Hyun},
  journal={IEEE Robot. Autom. Lett.},
  volume={6},
  number={4},
  pages={6458--6465},
  year={2021}
}

Acknowledgment

This work was supported by the Industry Core Technology Development Project, 20005062, "Development of Artificial Intelligence Robot Autonomous Navigation Technology for Agile Movement in Crowded Space," funded by the Ministry of Trade, Industry & Energy (MOTIE, Republic of Korea), and by the research project "Development of A.I. based recognition, judgment and control solution for autonomous vehicle corresponding to atypical driving environment," funded by the Ministry of Science and ICT (Republic of Korea), Contract No. 2019-0-00399. The student was supported by the BK21 FOUR program of the Ministry of Education (Republic of Korea).

License

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright

  • All code on this page is copyrighted by KAIST and published under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License. You must attribute the work in the manner specified by the author. You may not use the work for commercial purposes, and if you alter, transform, or build upon this work, you may distribute the resulting work only under the same license.