Unmanned Aerial Vehicle Landing on Rugged Terrain by On-Board LIDAR–Camera Positioning System
Abstract
1. Introduction
The main contributions of this work are summarized as follows:
1. A LIDAR–camera system is developed to determine the position of the UAV and to build geometric and color terrain maps of its environment, thereby solving the positioning problem in GPS-denied environments.
2. The proposed LIDAR–camera mapping method identifies potential landing sites by fusing weighted energy cost functions of texture and geometric flatness, thereby ensuring terrain safety.
3. A dataset classified by landing-scene difficulty is established to facilitate UAV landing-scene recognition, and an attention mechanism is added to the network to improve the accuracy and reliability of the scene analysis.
The remainder of this paper is organized as follows:
- Section 2 discusses relevant previous work.
- Section 3 introduces the UAV positioning methods.
- Section 4 describes the UAV landing site selection system.
- Section 5 analyzes experimental results to evaluate the feasibility and robustness of the proposed method.
- Section 6 presents the conclusions from this study and discusses future research directions.
2. Related Work
2.1. UAV Landing on Rugged Terrain
2.2. UAV SLAM during the Landing Process
2.3. Deep Learning for Landing Site Recognition
3. Proposed Positioning Framework
3.1. Unmanned Aerial Vehicle
3.2. Sensor Fusion
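The sensor-fusion stage combines LIDAR geometry with camera color so that the terrain map carries both shape and appearance (contribution 1 above). As a minimal sketch of the standard projection step such a fusion relies on — assuming a calibrated pinhole intrinsic matrix `K` and a LIDAR-to-camera extrinsic transform `T_cam_lidar`, both hypothetical placeholders here rather than the paper's calibrated values — the coloring of a scan might look like:

```python
import numpy as np

def colorize_lidar_points(points_lidar, image, K, T_cam_lidar):
    """Attach RGB from a camera image to LIDAR points.

    points_lidar : (N, 3) points in the LIDAR frame
    image        : (H, W, 3) undistorted RGB image
    K            : (3, 3) pinhole intrinsic matrix
    T_cam_lidar  : (4, 4) extrinsic transform, LIDAR frame -> camera frame
    """
    # Transform the scan into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera.
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Pinhole projection: (u, v) = K @ (X/Z, Y/Z, 1).
    uv = (K @ (pts_cam / pts_cam[:, 2:3]).T).T[:, :2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)

    # Discard projections that fall outside the image.
    h, w = image.shape[:2]
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # (M, 6): x, y, z, r, g, b in the camera frame.
    return np.hstack([pts_cam[ok], image[v[ok], u[ok]]])
```

Calibration of `K` typically follows Zhang's method [36].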
3.3. Positioning
3.3.1. LIDAR Position
Algorithm 1: Optimized evaluation of the pose parameters.
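The reference list points to Levenberg–Marquardt optimization [37] and LOAM-style scan registration [39] for the LIDAR pose estimate. The paper's exact residuals and parameterization are not reproduced in this outline; as an illustrative sketch only, a point-to-plane alignment of the current scan against the map, solved with SciPy's LM backend, can be written as:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def point_to_plane_residuals(pose, src, dst, normals):
    """Residuals n_i . (R p_i + t - q_i) for matched point pairs.

    pose    : 6-vector [rotation vector (3), translation (3)]
    src     : (N, 3) points from the current scan
    dst     : (N, 3) matched points on the map
    normals : (N, 3) unit surface normals of the map at dst
    """
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    t = pose[3:]
    # Row-wise dot product of each normal with its point-to-plane offset.
    return np.einsum('ij,ij->i', normals, src @ R.T + t - dst)

def estimate_lidar_pose(src, dst, normals, init=np.zeros(6)):
    # Levenberg-Marquardt minimization of the point-to-plane error.
    result = least_squares(point_to_plane_residuals, init,
                           args=(src, dst, normals), method='lm')
    return result.x  # optimized [rotation vector, translation]
```

In practice the correspondences (`src`, `dst`, `normals`) are rebuilt by nearest-neighbor search at each iteration of an outer ICP-style loop, and surface normals can be estimated with a difference-of-normals operator [35].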
3.3.2. Camera Position
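Given intrinsics from a prior calibration [36], the camera pose relative to mapped 3D points is classically recovered with a Perspective-n-Point solver. The following is a generic sketch of that step, not the paper's specific feature-matching pipeline:

```python
import cv2
import numpy as np

def camera_pose_pnp(object_pts, image_pts, K, dist_coeffs=None):
    """Recover camera pose from 3D map points and their 2D detections.

    object_pts : (N, 3) 3D points in the world/map frame
    image_pts  : (N, 2) corresponding pixel coordinates
    K          : (3, 3) intrinsic matrix from prior calibration
    """
    ok, rvec, tvec = cv2.solvePnP(
        object_pts.astype(np.float32), image_pts.astype(np.float32),
        K, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError('PnP failed')
    R, _ = cv2.Rodrigues(rvec)          # rotation: world -> camera
    cam_pos = (-R.T @ tvec).ravel()     # camera center in the world frame
    return R, tvec, cam_pos
```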
4. Landing Site Detection
4.1. Construction of the Cost Energy Map
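Contribution 2 builds the landing cost map by fusing weighted energy terms for geometric flatness and image texture. The paper's actual weights and term definitions are not given in this outline, so the sketch below uses hypothetical weights `w_flat` and `w_tex`, local height variance as the flatness term, and smoothed gradient magnitude as the texture term:

```python
import cv2
import numpy as np

def cost_energy_map(elevation, gray, w_flat=0.6, w_tex=0.4, win=5):
    """Weighted energy map; low cost = flat, texture-poor, safer terrain.

    elevation : (H, W) terrain height map rendered from the point cloud
    gray      : (H, W) co-registered grayscale image
    """
    elev = elevation.astype(np.float32)
    gray32 = gray.astype(np.float32)

    # Geometric term: local standard deviation of height; large values
    # indicate rough, non-flat terrain.
    mean = cv2.blur(elev, (win, win))
    mean_sq = cv2.blur(elev * elev, (win, win))
    roughness = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

    # Texture term: smoothed gradient magnitude; strong edges suggest
    # obstacles or surface clutter.
    gx = cv2.Sobel(gray32, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray32, cv2.CV_32F, 0, 1)
    texture = cv2.blur(cv2.magnitude(gx, gy), (win, win))

    # Normalize both terms to [0, 1] before the weighted fusion.
    norm = lambda x: (x - x.min()) / (np.ptp(x) + 1e-9)
    return w_flat * norm(roughness) + w_tex * norm(texture)
```

Candidate landing cells are then those whose fused cost falls below a chosen threshold; the runtime breakdown in Section 5.2.4 lists the corresponding downsampling, flatness, and smoothed-Canny stages.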
4.2. Place Recognition
4.2.1. Clustering of Alternative Landing Sites
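Low-cost cells are grouped into contiguous candidate sites; the runtime breakdown in Section 5.2.4 includes a Euclidean-distance stage consistent with such clustering. As a stand-in sketch (the paper's own clustering routine is not shown here), scikit-learn's DBSCAN performs Euclidean clustering of the surviving cells, with `eps`, `min_samples`, and `cost_thresh` as hypothetical values:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_landing_candidates(cost_map, resolution, cost_thresh=0.3,
                               eps=0.5, min_samples=20):
    """Group low-cost grid cells into candidate landing sites.

    cost_map   : (H, W) fused cost energy map
    resolution : metres per grid cell
    Returns a list of (centroid_xy_in_metres, n_cells) per cluster.
    """
    rows, cols = np.nonzero(cost_map < cost_thresh)
    pts = np.column_stack([cols, rows]) * resolution  # cells -> metres

    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)

    sites = []
    for lbl in set(labels) - {-1}:          # -1 marks noise cells
        members = pts[labels == lbl]
        sites.append((members.mean(axis=0), len(members)))
    # Prefer larger clusters: more clearance around the touchdown point.
    return sorted(sites, key=lambda s: -s[1])
```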
4.2.2. Datasets
4.2.3. Neural Network Architecture
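The experiments pair a ResNet50 backbone with the convolutional block attention module (CBAM) [45], which applies channel attention (a shared MLP over average- and max-pooled descriptors) followed by spatial attention (a convolution over channel-wise average and max maps). A compact PyTorch rendering of the published module:

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module (Woo et al., ECCV 2018)."""

    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        # Channel attention: shared MLP on avg- and max-pooled descriptors.
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False))
        # Spatial attention: 7x7 conv over stacked [avg, max] channel maps.
        self.spatial = nn.Conv2d(2, 1, kernel_size,
                                 padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # Channel attention gate.
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention gate.
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))
```

Consistent with the tables in Section 5, adding this module to ResNet50 lifts Top-1 accuracy on the outdoor test set from 0.799 to 0.841.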
5. Experimental Results
5.1. Simulation Experiments
5.2. Real-World Outdoor Experiments
5.2.1. Costmap Selection of Landing Sites
5.2.2. Place Recognition
5.2.3. Landing Analysis
5.2.4. Runtime Analysis
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Navia, J.; Mondragon, I.; Patino, D.; Colorado, J. Multispectral mapping in agriculture: Terrain mosaic using an autonomous quadcopter UAV. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1351–1358.
2. Stefanik, K.V.; Gassaway, J.C.; Kochersberger, K.; Abbott, A.L. UAV-based stereo vision for rapid aerial terrain mapping. GIScience Remote Sens. 2011, 48, 24–49.
3. Tomic, T.; Schmid, K.; Lutz, P.; Domel, A.; Kassecker, M.; Mair, E.; Grixa, I.L.; Ruess, F.; Suppa, M.; Burschka, D. Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue. IEEE Robot. Autom. Mag. 2012, 19, 46–56.
4. Nikolic, J.; Burri, M.; Rehder, J.; Leutenegger, S.; Huerzeler, C.; Siegwart, R. A UAV system for inspection of industrial facilities. In Proceedings of the 2013 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1–8.
5. Guo, Y.; Wu, M.; Tang, K.; Tie, J.; Li, X. Covert spoofing algorithm of UAV based on GPS/INS-integrated navigation. IEEE Trans. Veh. Technol. 2019, 68, 6557–6564.
6. Zhang, X.; Zhu, F.; Tao, X.; Duan, R. New optimal smoothing scheme for improving relative and absolute accuracy of tightly coupled GNSS/SINS integration. GPS Solut. 2017, 21, 861–872.
7. Obu, J.; Lantuit, H.; Grosse, G.; Günther, F.; Sachs, T.; Helm, V.; Fritz, M. Coastal erosion and mass wasting along the Canadian Beaufort Sea based on annual airborne LiDAR elevation data. Geomorphology 2017, 293, 331–346.
8. Lin, Y.C.; Cheng, Y.T.; Zhou, T.; Ravi, R.; Hasheminasab, S.M.; Flatt, J.E.; Troy, C.; Habib, A. Evaluation of UAV LiDAR for mapping coastal environments. Remote Sens. 2019, 11, 2893.
9. Guo, Q.; Su, Y.; Hu, T.; Zhao, X.; Wu, F.; Li, Y.; Liu, J.; Chen, L.; Xu, G.; Lin, G.; et al. An integrated UAV-borne lidar system for 3D habitat mapping in three forest ecosystems across China. Int. J. Remote Sens. 2017, 38, 2954–2972.
10. Suzuki, T.; Inoue, D.; Amano, Y. Robust UAV position and attitude estimation using multiple GNSS receivers for laser-based 3D mapping. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019.
11. Guo, Y.; Guo, J.; Liu, C.; Xiong, H.; Chai, L.; He, D. Precision landing test and simulation of the agricultural UAV on apron. Sensors 2020, 20, 3369.
12. Vidal, V.; Honório, L.; Santos, M.; Silva, M.; Cerqueira, A.; Oliveira, E. UAV vision aided positioning system for location and landing. In Proceedings of the 2017 18th International Carpathian Control Conference (ICCC), Sinaia, Romania, 28–31 May 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 228–233.
13. Rabah, M.; Rohan, A.; Talha, M.; Nam, K.H.; Kim, S.H. Autonomous vision-based target detection and safe landing for UAV. Int. J. Control Autom. Syst. 2018, 16, 3013–3025.
14. Hinzmann, T.; Stastny, T.; Cadena, C.; Siegwart, R.; Gilitschenski, I. Free LSD: Prior-free visual landing site detection for autonomous planes. IEEE Robot. Autom. Lett. 2018, 3, 2545–2552.
15. Forster, C.; Faessler, M.; Fontana, F.; Werlberger, M.; Scaramuzza, D. Continuous on-board monocular-vision-based elevation mapping applied to autonomous landing of micro aerial vehicles. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 111–118.
16. Miller, A.; Miller, B.; Popov, A.; Stepanyan, K. UAV landing based on the optical flow videonavigation. Sensors 2019, 19, 1351.
17. Cheng, H.W.; Chen, T.L.; Tien, C.H. Motion estimation by hybrid optical flow technology for UAV landing in an unvisited area. Sensors 2019, 19, 1380.
18. Zheng, W.; Yi, J.; Xiang, H.; Zhou, B.; Wang, D.; Zhao, C. A study for UAV autonomous safe landing-site selection on rough terrain. In Proceedings of the 2nd International Conference on Computing and Data Science, Stanford, CA, USA, 28–30 January 2021; pp. 1–7.
19. Yang, L.; Wang, C.; Wang, L. Autonomous UAVs landing site selection from point cloud in unknown environments. ISA Trans. 2022, 130, 610–628.
20. Loureiro, G.; Dias, A.; Martins, A.; Almeida, J. Emergency landing spot detection algorithm for unmanned aerial vehicles. Remote Sens. 2021, 13, 1930.
21. Singh, J.; Adwani, N.; Kandath, H.; Krishna, K.M. RHFSafeUAV: Real-time heuristic framework for safe landing of UAVs in dynamic scenarios. In Proceedings of the 2023 International Conference on Unmanned Aircraft Systems (ICUAS), Warsaw, Poland, 6–9 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 863–870.
22. Kaljahi, M.A.; Shivakumara, P.; Idris, M.Y.I.; Anisi, M.H.; Lu, T.; Blumenstein, M.; Noor, N.M. An automatic zone detection system for safe landing of UAVs. Expert Syst. Appl. 2019, 122, 319–333.
23. Guérin, J.; Delmas, K.; Guiochet, J. Certifying emergency landing for safe urban UAV. In Proceedings of the 2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W), Taipei, Taiwan, 21–24 June 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 55–62.
24. Dhami, H.S.; Ignatyev, D.; Tsourdos, A. Semantic segmentation based mapping systems for the safe and precise landing of flying vehicles. IFAC-PapersOnLine 2022, 55, 310–315.
25. Liu, F.; Shan, J.; Xiong, B.; Fang, Z. A real-time and multi-sensor-based landing area recognition system for UAVs. Drones 2022, 6, 118.
26. Alam, M.S.; Oluoch, J. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs). Expert Syst. Appl. 2021, 179, 115091.
27. Yan, L.; Qi, J.; Wang, M.; Wu, C.; Xin, J. A safe landing site selection method of UAVs based on LiDAR point clouds. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 6497–6502.
28. Lee, H.; Cho, S.; Jung, H. Real-time collision-free landing path planning for drone deliveries in urban environments. ETRI J. 2023, 45, 746–757.
29. Chen, C.; Chen, S.; Hu, G.; Chen, B.; Chen, P.; Su, K. An auto-landing strategy based on pan-tilt based visual servoing for unmanned aerial vehicle in GNSS-denied environments. Aerosp. Sci. Technol. 2021, 116, 106891.
30. Du, H.; Wang, W.; Wang, X.; Wang, Y. Autonomous landing scene recognition based on transfer learning for drones. J. Syst. Eng. Electron. 2023, 34, 28–35.
31. González-Trejo, J.; Mercado-Ravell, D.; Becerra, I.; Murrieta-Cid, R. On the visual-based safe landing of UAVs in populated areas: A crucial aspect for urban deployment. IEEE Robot. Autom. Lett. 2021, 6, 7901–7908.
32. Shen, K.; Zhuang, Y.; Zhu, Y. Incremental learning-based landmark recognition for micro-UAV autonomous landing. In Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 6786–6791.
33. Kim, C.; Lee, E.M.; Choi, J.; Jeon, J.; Kim, S.; Myung, H. ROLAND: Robust landing of UAV on moving platform using object detection and UWB based extended Kalman filter. In Proceedings of the 2021 21st International Conference on Control, Automation and Systems (ICCAS), Jeju, Republic of Korea, 12–15 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 249–254.
34. Gao, Y.; Ji, J.; Wang, Q.; Jin, R.; Lin, Y.; Shang, Z.; Cao, Y.; Shen, S.; Xu, C.; Gao, F. Adaptive tracking and perching for quadrotor in dynamic scenarios. IEEE Trans. Robot. 2023, 40, 499–519.
35. Ioannou, Y.; Taati, B.; Harrap, R.; Greenspan, M. Difference of normals as a multi-scale operator in unorganized point clouds. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 501–508.
36. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
37. Moré, J.J. The Levenberg–Marquardt algorithm: Implementation and theory. In Numerical Analysis; Springer: Berlin/Heidelberg, Germany, 1978; pp. 105–116.
38. Labbé, M.; Michaud, F. RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation. J. Field Robot. 2019, 36, 416–446.
39. Zhang, J.; Singh, S. LOAM: Lidar odometry and mapping in real-time. In Proceedings of Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014; Volume 2.
40. Lin, J.; Zhang, F. Loam livox: A fast, robust, high-precision LiDAR odometry and mapping package for LiDARs of small FoV. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 3126–3131.
41. Tsai, C.H.; Lin, Y.C. An accelerated image matching technique for UAV orthoimage registration. ISPRS J. Photogramm. Remote Sens. 2017, 128, 130–145.
42. Xia, G.S.; Hu, J.; Hu, F.; Shi, B.; Bai, X.; Zhong, Y.; Zhang, L.; Lu, X. AID: A benchmark data set for performance evaluation of aerial scene classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 3965–3981.
43. Yang, Y.; Newsam, S. Bag-of-visual-words and spatial extensions for land-use classification. In Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems, San Jose, CA, USA, 2–5 November 2010; pp. 270–279.
44. Zhou, B.; Lapedriza, A.; Khosla, A.; Oliva, A.; Torralba, A. Places: A 10 million image database for scene recognition. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 1452–1464.
45. Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
46. Gao, J.; Zhang, B.; Wu, Y.; Guo, C. Building extraction from high resolution remote sensing images based on improved Mask R-CNN. In Proceedings of the 2022 4th International Conference on Robotics and Computer Vision (ICRCV), Wuhan, China, 25–27 September 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6.
47. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
48. Zheng, J.; Zhao, Y.; Wu, W.; Chen, M.; Li, W.; Fu, H. Partial domain adaptation for scene classification from remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2022, 61, 1–17.
49. Liu, X.; Zhou, Y.; Zhao, J.; Yao, R.; Liu, B.; Ma, D.; Zheng, Y. Multiobjective ResNet pruning by means of EMOAs for remote sensing scene classification. Neurocomputing 2020, 381, 298–305.
50. Katigbak, C.N.R.; Garcia, J.R.B.; Gutang, J.E.D.; De Villa, J.B.; Alcid, A.D.C.; Vicerra, R.R.P.; Cruz, A.R.D.; Roxas, E.A.; Serrano, K.K.D. Autonomous trajectory tracking of a quadrotor UAV using PID controller. In Proceedings of the 2015 International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), Cebu, Philippines, 9–12 December 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–5.
| Method | GPS | Optical Flow [16] | Hybrid Optical Flow [17] | Ours |
|---|---|---|---|---|
| | 1.0112 | 2.0538 | 0.5193 | 0.3671 |
| | 0.5083 | 1.5819 | 0.1828 | 0.1568 |
| Network | Dataset | Loss | F1 | Recall | Top-1 Accuracy |
|---|---|---|---|---|---|
| ResNeXt50 | | 0.417 | 0.736 | 0.732 | 0.762 |
| | | 0.362 | 0.791 | 0.782 | 0.839 |
| VGG19 | | 0.389 | 0.768 | 0.753 | 0.776 |
| | | 0.332 | 0.811 | 0.792 | 0.832 |
| GoogleNetV4 | | 0.468 | 0.681 | 0.668 | 0.756 |
| | | 0.402 | 0.769 | 0.706 | 0.844 |
| MobileNetV3 | | 0.437 | 0.703 | 0.671 | 0.754 |
| | | 0.384 | 0.775 | 0.742 | 0.821 |
| ResNet18 | | 0.521 | 0.621 | 0.548 | 0.703 |
| | | 0.469 | 0.714 | 0.625 | 0.769 |
| ResNet34 | | 0.491 | 0.695 | 0.683 | 0.714 |
| | | 0.412 | 0.743 | 0.714 | 0.801 |
| ResNet50 | | 0.441 | 0.686 | 0.674 | 0.737 |
| | | 0.356 | 0.813 | 0.808 | 0.829 |
| ResNet101 | | 0.398 | 0.765 | 0.761 | 0.793 |
| | | 0.305 | 0.838 | 0.832 | 0.851 |
| ResNet50 + CBAM | | 0.402 | 0.767 | 0.758 | 0.786 |
| | | 0.322 | 0.845 | 0.842 | 0.857 |
| Network | ROM (MB) | RAM (MB) | Runtime (s) | Loss | F1 | Recall | Top-1 Accuracy |
|---|---|---|---|---|---|---|---|
| ResNeXt50 | 88 | 901.493 | 0.95 | 0.342 | 0.726 | 0.712 | 0.794 |
| VGG19 | 532 | 896.237 | 0.84 | 0.331 | 0.713 | 0.703 | 0.761 |
| GoogleNetV4 | 157 | 887.847 | 1.09 | 0.385 | 0.748 | 0.735 | 0.792 |
| MobileNetV3 | 154 | 914.367 | 1.18 | 0.379 | 0.734 | 0.721 | 0.777 |
| ResNet18 | 72 | 966.265 | 0.92 | 0.396 | 0.682 | 0.657 | 0.761 |
| ResNet34 | 83 | 923.461 | 0.96 | 0.329 | 0.701 | 0.673 | 0.773 |
| ResNet50 | 90 | 874.207 | 0.96 | 0.336 | 0.716 | 0.698 | 0.799 |
| ResNet101 | 162 | 863.052 | 1.03 | 0.314 | 0.765 | 0.753 | 0.817 |
| ResNet50 + CBAM | 92 | 855.925 | 0.98 | 0.328 | 0.817 | 0.823 | 0.841 |
| Method | (°) | (m) | (m/s) | (m/) |
|---|---|---|---|---|
| GPS | 3.11 | 1.41 | 0.32 | 0.21 |
| Optical Flow [16] | 2.36 | 1.31 | 0.29 | 0.18 |
| Hybrid Optical Flow [17] | 1.83 | 0.54 | 0.17 | 0.15 |
| Ours | 1.99 | 0.41 | 0.10 | 0.04 |
| | (a) Turf | (b) Rooftop Parking Lots | (c) Cottage Area | (d) Footbridge | (e) Swimming Pools |
|---|---|---|---|---|---|
| Obstacle distance (m) | 4.52 | 2.12 | 1.58 | 1.86 | 4.33 |
| Average inclination of selected site (°) | 3.52 | 2.32 | 4.11 | 2.45 | 3.18 |
| Speed at landing (m/s) | 0.13 | 0.12 | 0.15 | 0.13 | 0.10 |
| Type of landing site | Lawn (safe) | Path (safe) | Road (safe) | Road (risky) | Field (safe) |
| Algorithm | Time Cost (ms) |
|---|---|
| Downsampling | 101.75 ± 5.49 |
| Flatness | 103.35 ± 8.27 |
| Smoothed Canny | 8.64 ± 2.51 |
| Euclidean distance | 7.17 ± 2.93 |
| Place classification | 3920.21 ± 54.04 |
| Total | 4141.12 ± 73.24 |