Practical Guidelines for Performing UAV Mapping Flights with Snapshot Sensors
Abstract
1. Introduction
- O’Connor et al. [14] examined camera settings and their impact on image and orthomosaic quality, focusing on geosciences.
- Roth et al. [15] developed mapping software and provided a strong mathematical foundation.
- Assmann et al. [16] offered general flight guidelines, particularly for high latitudes and multispectral sensors.
- Tmušić et al. [17] presented general flight planning guidelines, including data processing and quality control, though with less emphasis on flight-specific details.
2. Sun-Sensor Geometry and BRDF
3. Flight Planning
- Is the primary goal to generate an accurate point cloud or DSM, or is it to produce an orthomosaic? What are the desired horizontal and vertical accuracy levels?
- What spatial resolution (ground sampling distance (GSD), see Section 3.2) is required?
- What spectral information and resolution are necessary?
- Area characteristics: What is the size, shape, and terrain (relief) of the area to be covered? Are there any legal restrictions or specific flight permits required? Where are potential take-off and landing points, and what obstacles might be present in the area?
- Timing: When can the flight be conducted? How much time is available for both the mapping flight and for generating the required outputs?
- Weather Conditions: What weather conditions are necessary for the mission’s objectives, what are the weather limitations for safe flights, and what is the actual weather forecast?
- What type of UAV(s) and sensor(s) are accessible, and what are their specifications and limitations?
- Which flight application will be used for planning and executing the mission?
3.1. Selection of Sensors and Lenses
- In general, ultra-wide lenses (focal length <20 mm) should be avoided due to significant distortion issues [35].
- Wide lenses (20–40 mm) generally show superior photogrammetry results [35,36]. With terrestrial laser scans as reference, Denter et al. [35] compared various lenses for reconstructing a 3D forest scene and found that 21 mm and 35 mm lenses performed best, as they provided a better lateral view of tree crowns and trunks. Similar results were reported for thermal cameras [37]. On the other hand, the broad range of viewing angles captured within a single image can lead to bidirectional reflectance distribution function (BRDF) issues [27,28] (Section 2), requiring higher overlap (Section 3.3).
- Longer focal lengths (e.g., 50–100 mm) produced poorer photogrammetry results than wide-angle cameras in the aforementioned study [35], but show less distortion and enable smaller GSDs, supporting resolutions in the sub-cm or even sub-mm range (Section 3.2).
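The GSD trade-off behind these lens choices follows directly from the pinhole camera model (Section 3.2): GSD equals flight height times pixel pitch divided by focal length. A minimal sketch, with illustrative (assumed) sensor parameters rather than values from any specific camera:

```python
def gsd_cm(flight_height_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance (cm/pixel) of a nadir-pointing camera,
    from the pinhole model: GSD = height * pixel pitch / focal length."""
    pixel_pitch_mm = sensor_width_mm / image_width_px
    gsd_m = flight_height_m * (pixel_pitch_mm / focal_length_mm)
    return gsd_m * 100.0

# Illustrative example: a hypothetical 1-inch sensor (13.2 mm wide,
# 5472 px across) flown at 100 m. The 8.8 mm wide-angle lens gives a
# much coarser GSD than a 50 mm lens at the same flight height.
wide = gsd_cm(100.0, 8.8, 13.2, 5472)   # ~2.7 cm/pixel
tele = gsd_cm(100.0, 50.0, 13.2, 5472)  # ~0.5 cm/pixel
```

Conversely, solving the same relation for flight height gives the altitude needed to reach a target GSD with a given lens.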
3.2. Ground Sampling Distance and Flight Height
3.2.1. GSD and Flight Height
3.2.2. Terrain Following
3.3. Overlap: Balancing Flight Time and Data Quality
3.4. Flight Speed
3.5. Flight Pattern and Flight Direction
3.5.1. Grid Flight Pattern
3.5.2. Flight Direction
3.6. Viewing Angle
3.7. Line of Sight Limitation: How Far Can You See a UAV?
4. Flight Execution: Ensuring Safe Flights at Best Quality
4.1. Weather Conditions and Their Impact on UAV Mapping Flights
4.1.1. Illumination
4.1.2. Wind Speed and Air Temperature
4.2. Time of the Flight
4.3. Ground Control Points and Precision GNSS Technology
- Required number: A minimum of five GCPs is required for successful georeferencing [95,148]. For larger areas or areas with complex terrain, additional GCPs are needed [56,149,150], in particular to attain high vertical accuracy [71,151]. The optimal GCP density ultimately depends on the desired accuracy and the complexity of the relief [95]; in general, a meta-analysis showed that accuracy improves up to about 15 GCPs, beyond which additional GCPs are only sensible in highly complex or very large areas [57].
- Optimal spatial distribution: The spatial arrangement of GCPs is as critical as their quantity [56,147,150,151]. They should cover the entire survey area, ideally in a stratified distribution or along the area edges, with a few placed inside the area [151,152]. They should also be distributed across the different height classes of the terrain to be covered [71]. For a minimal setup of five GCPs, a quincunx (die-like) arrangement is recommended [147]. GCPs near edges should be positioned so that they are captured by multiple camera views, and GCPs should not be placed too close to each other, as this can complicate manual matching in SfM software and degrade referencing accuracy.
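The quincunx arrangement recommended for the minimal five-GCP setup can be sketched as below; the `inset_frac` parameter is an illustrative assumption, chosen so that edge markers are pulled slightly inward and remain visible in multiple overlapping images:

```python
def quincunx_gcps(x_min, y_min, x_max, y_max, inset_frac=0.1):
    """Five GCP positions in a die-face (quincunx) pattern:
    four near the corners, inset from the edges by a fraction of the
    area extent, plus one marker in the centre of the survey area."""
    dx = (x_max - x_min) * inset_frac
    dy = (y_max - y_min) * inset_frac
    corners = [(x_min + dx, y_min + dy), (x_max - dx, y_min + dy),
               (x_min + dx, y_max - dy), (x_max - dx, y_max - dy)]
    centre = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)
    return corners + [centre]

# Example: a 100 m x 100 m plot in local coordinates.
layout = quincunx_gcps(0.0, 0.0, 100.0, 100.0)
```

For larger or high-relief areas, the same idea extends to a denser stratified grid, with extra markers added per height class.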
4.4. Camera Set-Up and Camera Settings
4.5. Reference Measurements and Reference Targets
- Very dark panel (as dark as possible; ideally about 1% reflectance): A very dark panel should be included because the reflectance of vegetation and water across most of the visible spectrum is very low (2–4%), and the reference panel should ideally have an even lower reflectance.
- Medium-dark panels (dark grey, 8–10%, and medium grey, 15–20%): Panels in this range are important because of their relevance in the visible region, and because some multispectral cameras tend to saturate over brighter panels placed in otherwise darker surroundings (such as vegetation or soil), particularly in the visible bands. Including this range of panels still enables the ELM for all channels.
- Bright grey panel (60–75% reflectance): To cover brighter areas and, in particular, to correctly estimate the reflectance of vegetation in the near-infrared.
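As a sketch of how such a panel set feeds the empirical line method (ELM): per band, a linear digital-number-to-reflectance relation is fitted through the panels and then applied to the whole image. The panel DN values below are made up for illustration; real values come from averaging pixels over each panel in the imagery:

```python
import numpy as np

def empirical_line(panel_dns, panel_reflectances, image_dns):
    """Empirical line method for one band: fit reflectance = a*DN + b
    through the reference panels, then apply the fit to image pixels."""
    a, b = np.polyfit(panel_dns, panel_reflectances, 1)
    return a * np.asarray(image_dns, dtype=float) + b

# Hypothetical four-panel set spanning the dark-to-bright range above
# (DNs are invented; reflectances follow the panel specifications).
panel_dns = [500.0, 4000.0, 9000.0, 30000.0]
panel_refl = [0.01, 0.09, 0.18, 0.70]
reflectance = empirical_line(panel_dns, panel_refl, [500.0, 9000.0])
```

With only two panels the fit reduces to a straight line through both points; more panels make the regression robust to measurement noise on any single panel.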
- Camera accuracy correction: Microbolometer sensors have limited absolute accuracy, roughly between ±1 and ±5 K. Cold and warm reference panels with known temperatures can be used to linearly correct the brightness temperature of the image [26,146,173,174], similar to the ELM of reflectance measurements. Typically, (ice-cold) water is used, or very bright (low-temperature) and dark (high-temperature) reference panels. Han et al. [175] constructed special temperature-controlled reference targets. Note that it is crucial to retrieve the exact emissivity of each panel and to correct for it during the workflow. In addition, vignetting in thermal cameras can create temperature differences of up to several degrees between the edges and the center of the image [26]. The non-uniformity correction (NUC, Section 4.4) is not sufficient for some models [26], in which case the vignetting can be quantified by imaging a uniform blackbody [26,176]. However, this is not strictly required, provided that sufficiently high horizontal and vertical overlaps are foreseen.
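The two-panel correction described above amounts to a linear rescaling of brightness temperatures so that both references read their known values, analogous to the ELM. A minimal sketch (the panel temperatures in the example are illustrative):

```python
def two_point_correction(t_measured, cold_meas, cold_true,
                         warm_meas, warm_true):
    """Linearly rescale image brightness temperatures so that the cold
    and warm reference panels read their known temperatures."""
    gain = (warm_true - cold_true) / (warm_meas - cold_meas)
    return cold_true + gain * (t_measured - cold_meas)

# Hypothetical scene: ice water imaged at 2.0 C but known to be 0.5 C,
# warm panel imaged at 41.0 C but known to be 38.0 C.
t_corrected = two_point_correction(25.0, 2.0, 0.5, 41.0, 38.0)
```

Applied per image (or per swath, for drifting microbolometers), this removes the sensor's absolute offset and gain error but not vignetting, which is a spatial pattern.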
- Atmospheric correction: Between the object and the camera, thermal radiation from the object is partially absorbed by the atmosphere, while atmospheric scattering contributes additional signals [34] (Figure 13); see Heinemann et al. [126] for an atmospheric correction protocol. They found that not correcting for atmospheric conditions leads to an error of 0.1–0.4 K. Atmospheric correction requires air temperature and relative humidity to be measured (Section 4.1).
- Correction for emissivity and incoming radiation: The thermal radiation leaving an object (Lout) is influenced by the emissivity ε, the longwave incoming radiation Lin, and the surface temperature Ts, the variable of interest (Figure 13) [125]. Heinemann et al. [126] reported errors of 0.5–2.9 K in sunny conditions, depending on the emissivity of the object, when not correcting for emissivity and longwave incoming radiation. Both emissivity and longwave incoming radiation need to be known with sufficient accuracy. Emissivity is the most sensitive variable [125]. Unfortunately, it cannot be directly measured but needs to be estimated either by image segmentation combined with look-up tables, or through an NDVI approach (see [126] for the protocol). Longwave incoming radiation Lin can be measured as the imaged temperature of a reference panel covered with crumpled aluminum foil [34,126]. This approach is economical and user-friendly and should always be included in thermal measurements.
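The correction can be sketched by inverting the graybody radiation balance Lout = εσTs⁴ + (1 − ε)Lin for Ts. This sketch assumes a single broadband channel with Lout derived from the sensor brightness temperature via the Stefan–Boltzmann law, and neglects the atmospheric term treated in the previous bullet:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def surface_temperature(t_brightness_K, emissivity, l_in):
    """Invert Lout = eps*sigma*Ts^4 + (1 - eps)*Lin for the surface
    temperature Ts (K). Lout is taken from the sensor brightness
    temperature under a broadband graybody assumption; l_in is the
    longwave incoming radiation (W m^-2), e.g. measured via a
    crumpled-aluminum-foil panel."""
    l_out = SIGMA * t_brightness_K ** 4
    l_surface = (l_out - (1.0 - emissivity) * l_in) / emissivity
    return (l_surface / SIGMA) ** 0.25
```

Because the reflected term (1 − ε)Lin is subtracted before inversion, the sensitivity to an emissivity error grows as ε departs from 1, consistent with emissivity being the most sensitive variable.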
- Normalization of atmospheric conditions: For research on drought stress or evapotranspiration of terrestrial ecosystems, surface temperature is usually expressed as a thermal index, similar to the vegetation indices for reflectance measurements [125]. The most common index, the crop water stress index (CWSI) [177,178], uses the lowest and highest possible temperatures that the vegetation can attain under the given conditions. These temperatures should not be confused with the low- and high-temperature panels for the thermal accuracy correction, since it is crucial that they correspond to temperatures of the actual vegetation [179,180]. A common cold reference is a wet cloth, which evaporates at a maximal rate and essentially provides the wet-bulb temperature [146,181,182]. However, it does not accurately represent canopy conditions [180,183]. Maes et al. [180] showed that artificial leaves made of cotton, kept wet by constantly absorbing water from a reservoir, give a more precise estimate, but the scalability of this method to field level remains to be explored.
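In its simplest empirical form, the CWSI is a normalization of canopy temperature between the wet and dry reference temperatures; a minimal sketch:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Empirical crop water stress index:
    0 = fully transpiring canopy (at the wet reference temperature),
    1 = non-transpiring canopy (at the dry reference temperature)."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative values (deg C): wet reference 25, dry reference 35.
stress = cwsi(30.0, 25.0, 35.0)  # mid-range canopy temperature
```

The quality of the index therefore hinges entirely on how well the wet and dry references track the actual vegetation, which is exactly why the choice of reference target discussed above matters.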
5. Discussion
5.1. Remaining Limitations and Knowledge Gaps
5.2. Towards a Harmonized Mapping Protocol?
5.3. Is There an Alternative to the Tedious Flight Mapping and Processing?
6. Conclusions
Supplementary Materials
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Statista. Drones—Worldwide. Available online: https://www.statista.com/outlook/cmo/consumer-electronics/drones/worldwide (accessed on 6 November 2024).
- Collier, P. Photogrammetry/Aerial Photography. In International Encyclopedia of Human Geography; Kitchin, R., Thrift, N., Eds.; Elsevier: Oxford, UK, 2009; pp. 151–156.
- Maes, W.H.; Steppe, K. Perspectives for remote sensing with Unmanned Aerial Vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164.
- Tang, L.; Shao, G. Drone remote sensing for forestry research and practices. J. For. Res. 2015, 26, 791–797.
- Koh, L.P.; Wich, S.A. Dawn of Drone Ecology: Low-Cost Autonomous Aerial Vehicles for Conservation. Trop. Conserv. Sci. 2012, 5, 121–132.
- Sun, Z.; Wang, X.; Wang, Z.; Yang, L.; Xie, Y.; Huang, Y. UAVs as remote sensing platforms in plant ecology: Review of applications and challenges. J. Plant Ecol. 2021, 14, 1003–1023.
- Mesas-Carrascosa, F.-J.; Notario García, M.D.; Meroño de Larriva, J.E.; García-Ferrer, A. An Analysis of the Influence of Flight Parameters in the Generation of Unmanned Aerial Vehicle (UAV) Orthomosaicks to Survey Archaeological Areas. Sensors 2016, 16, 1838.
- Pepe, M.; Alfio, V.S.; Costantino, D. UAV Platforms and the SfM-MVS Approach in the 3D Surveys and Modelling: A Review in the Cultural Heritage Field. Appl. Sci. 2022, 12, 12886.
- Bhardwaj, A.; Sam, L.; Akanksha; Martín-Torres, F.J.; Kumar, R. UAVs as remote sensing platform in glaciology: Present applications and future prospects. Remote Sens. Environ. 2016, 175, 196–204.
- Park, S.; Choi, Y. Applications of Unmanned Aerial Vehicles in Mining from Exploration to Reclamation: A Review. Minerals 2020, 10, 663.
- Śledź, S.; Ewertowski, M.W.; Piekarczyk, J. Applications of unmanned aerial vehicle (UAV) surveys and Structure from Motion photogrammetry in glacial and periglacial geomorphology. Geomorphology 2021, 378, 107620.
- Jiang, S.; Jiang, C.; Jiang, W.S. Efficient structure from motion for large-scale UAV images: A review and a comparison of SfM tools. ISPRS J. Photogramm. Remote Sens. 2020, 167, 230–251.
- Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’Connor, J.; Rosette, J. Structure from Motion Photogrammetry in Forestry: A Review. Curr. For. Rep. 2019, 5, 155–168.
- O’Connor, J.; Smith, M.J.; James, M.R. Cameras and settings for aerial surveys in the geosciences: Optimising image data. Prog. Phys. Geogr. Earth Environ. 2017, 41, 325–344.
- Roth, L.; Hund, A.; Aasen, H. PhenoFly Planning Tool: Flight planning for high-resolution optical remote sensing with unmanned areal systems. Plant Methods 2018, 14, 116.
- Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2019, 7, 54–75.
- Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020, 12, 1001.
- Li, Z.; Roy, D.P.; Zhang, H.K. The incidence and magnitude of the hot-spot bidirectional reflectance distribution function (BRDF) signature in GOES-16 Advanced Baseline Imager (ABI) 10 and 15 min reflectance over north America. Remote Sens. Environ. 2021, 265, 112638.
- Jafarbiglu, H.; Pourreza, A. Impact of sun-view geometry on canopy spectral reflectance variability. ISPRS J. Photogramm. Remote Sens. 2023, 196, 270–286.
- Stow, D.; Nichol, C.J.; Wade, T.; Assmann, J.J.; Simpson, G.; Helfter, C. Illumination Geometry and Flying Height Influence Surface Reflectance and NDVI Derived from Multispectral UAS Imagery. Drones 2019, 3, 55.
- Bovend’aerde, L. An Empirical BRDF Model for the Queensland Rainforests. Master’s Thesis, Ghent University, Ghent, Belgium, 2016.
- Bian, Z.; Roujean, J.-L.; Cao, B.; Du, Y.; Li, H.; Gamet, P.; Fang, J.; Xiao, Q.; Liu, Q. Modeling the directional anisotropy of fine-scale TIR emissions over tree and crop canopies based on UAV measurements. Remote Sens. Environ. 2021, 252, 112150.
- Bian, Z.; Roujean, J.L.; Lagouarde, J.P.; Cao, B.; Li, H.; Du, Y.; Liu, Q.; Xiao, Q.; Liu, Q. A semi-empirical approach for modeling the vegetation thermal infrared directional anisotropy of canopies based on using vegetation indices. ISPRS J. Photogramm. Remote Sens. 2020, 160, 136–148.
- Lagouarde, J.P.; Dayau, S.; Moreau, P.; Guyon, D. Directional Anisotropy of Brightness Surface Temperature Over Vineyards: Case Study Over the Medoc Region (SW France). IEEE Geosci. Remote Sens. Lett. 2014, 11, 574–578.
- Jiang, L.; Zhan, W.; Tu, L.; Dong, P.; Wang, S.; Li, L.; Wang, C.; Wang, C. Diurnal variations in directional brightness temperature over urban areas through a multi-angle UAV experiment. Build. Environ. 2022, 222, 109408.
- Kelly, J.; Kljun, N.; Olsson, P.-O.; Mihai, L.; Liljeblad, B.; Weslien, P.; Klemedtsson, L.; Eklundh, L. Challenges and Best Practices for Deriving Temperature Data from an Uncalibrated UAV Thermal Infrared Camera. Remote Sens. 2019, 11, 567.
- Stark, B.; Zhao, T.; Chen, Y. An analysis of the effect of the bidirectional reflectance distribution function on remote sensing imagery accuracy from Small Unmanned Aircraft Systems. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 1342–1350.
- Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256.
- Heim, R.H.J.; Okole, N.; Steppe, K.; Van Labeke, M.-C.; Geedicke, I.; Maes, W.H. An applied framework to unlocking multi-angular UAV reflectance data: A case study for classification of plant parameters in maize (Zea mays). Precis. Agric. 2024, 25, 1751–1775.
- Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P. Quantitative remote sensing at ultra-high resolution with uav spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 2018, 10, 1091.
- Tu, Y.-H.; Phinn, S.; Johansen, K.; Robson, A. Assessing Radiometric Correction Approaches for Multi-Spectral UAS Imagery for Horticultural Applications. Remote Sens. 2018, 10, 1684.
- Li, Y.Q.; Masschelein, B.; Vandebriel, R.; Vanmeerbeeck, G.; Luong, H.; Maes, W.; Van Beek, J.; Pauly, K.; Jayapala, M.; Charle, W.; et al. Compact VNIR snapshot multispectral airborne system and integration with drone system. In Proceedings of the Conference on Photonic Instrumentation Engineering IX Part of SPIE Photonics West OPTO Conference, San Francisco, CA, USA, 22 January–24 February 2022.
- Griffiths, D.; Burningham, H. Comparison of pre- and self-calibrated camera calibration models for UAS-derived nadir imagery for a SfM application. Prog. Phys. Geogr. Earth Environ. 2019, 43, 215–235.
- Maes, W.; Huete, A.; Steppe, K. Optimizing the processing of UAV-based thermal imagery. Remote Sens. 2017, 9, 476.
- Denter, M.; Frey, J.; Kattenborn, T.; Weinacker, H.; Seifert, T.; Koch, B. Assessment of camera focal length influence on canopy reconstruction quality. ISPRS Open J. Photogramm. Remote Sens. 2022, 6, 100025.
- Kraus, K. Photogrammetry: Geometry from Images and Laser Scans; Walter de Gruyter: Berlin, Germany, 2011.
- Sangha, H.S.; Sharda, A.; Koch, L.; Prabhakar, P.; Wang, G. Impact of camera focal length and sUAS flying altitude on spatial crop canopy temperature evaluation. Comput. Electron. Agric. 2020, 172, 105344.
- Zhu, Y.; Guo, Q.; Tang, Y.; Zhu, X.; He, Y.; Huang, H.; Luo, S. CFD simulation and measurement of the downwash airflow of a quadrotor plant protection UAV during operation. Comput. Electron. Agric. 2022, 201, 107286.
- Van De Vijver, R.; Mertens, K.; Heungens, K.; Nuyttens, D.; Wieme, J.; Maes, W.H.; Van Beek, J.; Somers, B.; Saeys, W. Ultra-High-Resolution UAV-Based Detection of Alternaria solani Infections in Potato Fields. Remote Sens. 2022, 14, 6232.
- Rasmussen, J.; Nielsen, J.; Streibig, J.C.; Jensen, J.E.; Pedersen, K.S.; Olsen, S.I. Pre-harvest weed mapping of Cirsium arvense in wheat and barley with off-the-shelf UAVs. Precis. Agric. 2019, 20, 983–999.
- Gao, J.; Liao, W.; Nuyttens, D.; Lootens, P.; Vangeyte, J.; Pižurica, A.; He, Y.; Pieters, J.G. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 67, 43–53.
- Barreto, A.; Lottes, P.; Ispizua Yamati, F.R.; Baumgarten, S.; Wolf, N.A.; Stachniss, C.; Mahlein, A.-K.; Paulus, S. Automatic UAV-based counting of seedlings in sugar-beet field and extension to maize and strawberry. Comput. Electron. Agric. 2021, 191, 106493.
- García-Martínez, H.; Flores-Magdaleno, H.; Khalil-Gardezi, A.; Ascencio-Hernández, R.; Tijerina-Chávez, L.; Vázquez-Peña, M.A.; Mancilla-Villa, O.R. Digital Count of Corn Plants Using Images Taken by Unmanned Aerial Vehicles and Cross Correlation of Templates. Agronomy 2020, 10, 469.
- Petti, D.; Li, C.Y. Weakly-supervised learning to automatically count cotton flowers from aerial imagery. Comput. Electron. Agric. 2022, 194, 106734.
- Xu, X.; Li, H.; Yin, F.; Xi, L.; Qiao, H.; Ma, Z.; Shen, S.; Jiang, B.; Ma, X. Wheat ear counting using K-means clustering segmentation and convolutional neural network. Plant Methods 2020, 16, 106.
- Fernandez-Gallego, J.A.; Lootens, P.; Borra-Serrano, I.; Derycke, V.; Haesaert, G.; Roldán-Ruiz, I.; Araus, J.L.; Kefauver, S.C. Automatic wheat ear counting using machine learning based on RGB UAV imagery. Plant J. 2020, 103, 1603–1613.
- Wieme, J.; Leroux, S.; Cool, S.R.; Van Beek, J.; Pieters, J.G.; Maes, W.H. Ultra-high-resolution UAV-imaging and supervised deep learning for accurate detection of Alternaria solani in potato fields. Front. Plant Sci. 2024, 15, 1206998.
- Kontogiannis, S.; Konstantinidou, M.; Tsioukas, V.; Pikridas, C. A Cloud-Based Deep Learning Framework for Downy Mildew Detection in Viticulture Using Real-Time Image Acquisition from Embedded Devices and Drones. Information 2024, 15, 178.
- Carl, C.; Landgraf, D.; Van der Maaten-Theunissen, M.; Biber, P.; Pretzsch, H. Robinia pseudoacacia L. Flower Analyzed by Using An Unmanned Aerial Vehicle (UAV). Remote Sens. 2017, 9, 1091.
- Gallmann, J.; Schüpbach, B.; Jacot, K.; Albrecht, M.; Winizki, J.; Kirchgessner, N.; Aasen, H. Flower Mapping in Grasslands With Drones and Deep Learning. Front. Plant Sci. 2022, 12, 774965.
- Gröschler, K.-C.; Muhuri, A.; Roy, S.K.; Oppelt, N. Monitoring the Population Development of Indicator Plants in High Nature Value Grassland Using Machine Learning and Drone Data. Drones 2023, 7, 644.
- Pu, R. Mapping Tree Species Using Advanced Remote Sensing Technologies: A State-of-the-Art Review and Perspective. J. Remote Sens. 2021, 2021, 9812624.
- Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote Sens. 2019, 11, 1252.
- Tu, Y.-H.; Phinn, S.; Johansen, K.; Robson, A.; Wu, D. Optimising drone flight planning for measuring horticultural tree crop structure. ISPRS J. Photogramm. Remote Sens. 2020, 160, 83–96.
- Leitão, J.P.; Moy de Vitry, M.; Scheidegger, A.; Rieckermann, J. Assessing the quality of digital elevation models obtained from mini unmanned aerial vehicles for overland flow modelling in urban areas. Hydrol. Earth Syst. Sci. 2016, 20, 1637–1653.
- Lee, S.; Park, J.; Choi, E.; Kim, D. Factors Influencing the Accuracy of Shallow Snow Depth Measured Using UAV-Based Photogrammetry. Remote Sens. 2021, 13, 828.
- Deliry, S.I.; Avdan, U. Accuracy of Unmanned Aerial Systems Photogrammetry and Structure from Motion in Surveying and Mapping: A Review. J. Indian Soc. Remote Sens. 2021, 49, 1997–2017.
- Yurtseven, H. Comparison of GNSS-, TLS- and Different Altitude UAV-Generated Datasets on the Basis of Spatial Differences. ISPRS Int. J. Geo-Inf. 2019, 8, 175.
- Jay, S.; Baret, F.; Dutartre, D.; Malatesta, G.; Héno, S.; Comar, A.; Weiss, M.; Maupas, F. Exploiting the centimeter resolution of UAV multispectral imagery to improve remote-sensing estimates of canopy structure and biochemistry in sugar beet crops. Remote Sens. Environ. 2018, 231, 110898.
- Zhu, W.; Rezaei, E.E.; Nouri, H.; Sun, Z.; Li, J.; Yu, D.; Siebert, S. UAV Flight Height Impacts on Wheat Biomass Estimation via Machine and Deep Learning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 7471–7485.
- Avtar, R.; Suab, S.A.; Syukur, M.S.; Korom, A.; Umarhadi, D.A.; Yunus, A.P. Assessing the Influence of UAV Altitude on Extracted Biophysical Parameters of Young Oil Palm. Remote Sens. 2020, 12, 3030.
- Johansen, K.; Raharjo, T.; McCabe, M.F. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854.
- Yin, Q.; Zhang, Y.; Li, W.; Wang, J.; Wang, W.; Ahmad, I.; Zhou, G.; Huo, Z. Estimation of Winter Wheat SPAD Values Based on UAV Multispectral Remote Sensing. Remote Sens. 2023, 15, 3595.
- Singh, C.H.; Mishra, V.; Jain, K. High-resolution mapping of forested hills using real-time UAV terrain following. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, X-1/W1-2023, 665–671.
- Agüera-Vega, F.; Ferrer-González, E.; Martínez-Carricondo, P.; Sánchez-Hermosilla, J.; Carvajal-Ramírez, F. Influence of the Inclusion of Off-Nadir Images on UAV-Photogrammetry Projects from Nadir Images and AGL (Above Ground Level) or AMSL (Above Mean Sea Level) Flights. Drones 2024, 8, 662.
- Zhao, H.; Zhang, B.; Hu, W.; Liu, J.; Li, D.; Liu, Y.; Yang, H.; Pan, J.; Xu, L. Adaptable Flight Line Planning for Airborne Photogrammetry Using DEM. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 6206–6218.
- Kozmus Trajkovski, K.; Grigillo, D.; Petrovič, D. Optimization of UAV Flight Missions in Steep Terrain. Remote Sens. 2020, 12, 1293.
- Smith, M.W.; Carrivick, J.L.; Quincey, D.J. Structure from motion photogrammetry in physical geography. Prog. Phys. Geogr. Earth Environ. 2016, 40, 247–275.
- Domingo, D.; Ørka, H.O.; Næsset, E.; Kachamba, D.; Gobakken, T. Effects of UAV Image Resolution, Camera Type, and Image Overlap on Accuracy of Biomass Predictions in a Tropical Woodland. Remote Sens. 2019, 11, 948.
- Lopes Bento, N.; Araújo E Silva Ferraz, G.; Alexandre Pena Barata, R.; Santos Santana, L.; Diennevan Souza Barbosa, B.; Conti, L.; Becciolini, V.; Rossi, G. Overlap influence in images obtained by an unmanned aerial vehicle on a digital terrain model of altimetric precision. Eur. J. Remote Sens. 2022, 55, 263–276.
- Gonçalves, G.; Gonçalves, D.; Gómez-Gutiérrez, Á.; Andriolo, U.; Pérez-Alvárez, J.A. 3D Reconstruction of Coastal Cliffs from Fixed-Wing and Multi-Rotor UAS: Impact of SfM-MVS Processing Parameters, Image Redundancy and Acquisition Geometry. Remote Sens. 2021, 13, 1222.
- Jiménez-Jiménez, S.I.; Ojeda-Bustamante, W.; Marcial-Pablo, M.d.J.; Enciso, J. Digital Terrain Models Generated with Low-Cost UAV Photogrammetry: Methodology and Accuracy. ISPRS Int. J. Geo-Inf. 2021, 10, 285.
- Flores-de-Santiago, F.; Valderrama-Landeros, L.; Rodríguez-Sobreyra, R.; Flores-Verdugo, F. Assessing the effect of flight altitude and overlap on orthoimage generation for UAV estimates of coastal wetlands. J. Coast. Conserv. 2020, 24, 35.
- Chaudhry, M.H.; Ahmad, A.; Gulzar, Q. Impact of UAV Surveying Parameters on Mixed Urban Landuse Surface Modelling. ISPRS Int. J. Geo-Inf. 2020, 9, 656.
- Dandois, J.P.; Olano, M.; Ellis, E.C. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895–13920.
- Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M.J.P.A. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2018, 19, 115–133.
- Frey, J.; Kovach, K.; Stemmler, S.; Koch, B. UAV Photogrammetry of Forests as a Vulnerable Process. A Sensitivity Analysis for a Structure from Motion RGB-Image Pipeline. Remote Sens. 2018, 10, 912.
- Malbéteau, Y.; Johansen, K.; Aragon, B.; Al-Mashhawari, S.K.; McCabe, M.F. Overcoming the Challenges of Thermal Infrared Orthomosaics Using a Swath-Based Approach to Correct for Dynamic Temperature and Wind Effects. Remote Sens. 2021, 13, 3255.
- Olbrycht, R.; Więcek, B. New approach to thermal drift correction in microbolometer thermal cameras. Quant. InfraRed Thermogr. J. 2015, 12, 184–195.
- Boesch, R. Thermal remote sensing with UAV-based workflows. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, XLII-2/W6, 41–46.
- Sieberth, T.; Wackrow, R.; Chandler, J.H. Motion blur disturbs—The influence of motion-blurred images in photogrammetry. Photogramm. Rec. 2014, 29, 434–453.
- Lee, K.; Ban, Y.; Kim, C. Motion Blur Kernel Rendering Using an Inertial Sensor: Interpreting the Mechanism of a Thermal Detector. Sensors 2022, 22, 1893.
- Ahmed, S.; El-Shazly, A.; Abed, F.; Ahmed, W. The Influence of Flight Direction and Camera Orientation on the Quality Products of UAV-Based SfM-Photogrammetry. Appl. Sci. 2022, 12, 10492.
- Mora-Felix, Z.D.; Sanhouse-Garcia, A.J.; Bustos-Terrones, Y.A.; Loaiza, J.G.; Monjardin-Armenta, S.A.; Rangel-Peraza, J.G. Effect of photogrammetric RPAS flight parameters on plani-altimetric accuracy of DTM. Open Geosci. 2020, 12, 1017–1035.
- Beigi, P.; Rajabi, M.S.; Aghakhani, S. An Overview of Drone Energy Consumption Factors and Models. In Handbook of Smart Energy Systems; Fathi, M., Zio, E., Pardalos, P.M., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 1–20.
- Daniels, L.; Eeckhout, E.; Wieme, J.; Dejaegher, Y.; Audenaert, K.; Maes, W.H. Identifying the Optimal Radiometric Calibration Method for UAV-Based Multispectral Imaging. Remote Sens. 2023, 15, 2909.
- Fawcett, D.; Anderson, K. Investigating Impacts of Calibration Methodology and Irradiance Variations on Lightweight Drone-Based Sensor Derived Surface Reflectance Products; SPIE: Bellingham, WA, USA, 2019; Volume 11149.
- Olsson, P.-O.; Vivekar, A.; Adler, K.; Garcia Millan, V.E.; Koc, A.; Alamrani, M.; Eklundh, L. Radiometric Correction of Multispectral UAS Images: Evaluating the Accuracy of the Parrot Sequoia Camera and Sunshine Sensor. Remote Sens. 2021, 13, 577.
- Zhu, H.; Huang, Y.; An, Z.; Zhang, H.; Han, Y.; Zhao, Z.; Li, F.; Zhang, C.; Hou, C. Assessing radiometric calibration methods for multispectral UAV imagery and the influence of illumination, flight altitude and flight time on reflectance, vegetation index and inversion of winter wheat AGB and LAI. Comput. Electron. Agric. 2024, 219, 108821.
- Mobley, C.D. Estimation of the remote-sensing reflectance from above-surface measurements. Appl. Opt. 1999, 38, 7442–7455.
- Bi, R.; Gan, S.; Yuan, X.; Li, R.; Gao, S.; Luo, W.; Hu, L. Studies on Three-Dimensional (3D) Accuracy Optimization and Repeatability of UAV in Complex Pit-Rim Landforms As Assisted by Oblique Imaging and RTK Positioning. Sensors 2021, 21, 8109.
- Nesbit, P.R.; Hugenholtz, C.H. Enhancing UAV–SfM 3D Model Accuracy in High-Relief Landscapes by Incorporating Oblique Images. Remote Sens. 2019, 11, 239.
- Jiang, S.; Jiang, W.; Huang, W.; Yang, L. UAV-based oblique photogrammetry for outdoor data acquisition and offsite visual inspection of transmission line. Remote Sens. 2017, 9, 278.
- Lin, Y.; Jiang, M.; Yao, Y.; Zhang, L.; Lin, J. Use of UAV oblique imaging for the detection of individual trees in residential environments. Urban For. Urban Green. 2015, 14, 404–412.
- Dai, W.; Zheng, G.; Antoniazza, G.; Zhao, F.; Chen, K.; Lu, W.; Lane, S.N. Improving UAV-SfM photogrammetry for modelling high-relief terrain: Image collection strategies and ground control quantity. Earth Surf. Process. Landf. 2023, 48, 2884–2899.
- Mueller, M.M.; Dietenberger, S.; Nestler, M.; Hese, S.; Ziemer, J.; Bachmann, F.; Leiber, J.; Dubois, C.; Thiel, C. Novel UAV Flight Designs for Accuracy Optimization of Structure from Motion Data Products. Remote Sens. 2023, 15, 4308.
- Sadeq, H.A. Accuracy assessment using different UAV image overlaps. J. Unmanned Veh. Syst. 2019, 7, 175–193.
- Rossi, P.; Mancini, F.; Dubbini, M.; Mazzone, F.; Capra, A. Combining nadir and oblique UAV imagery to reconstruct quarry topography: Methodology and feasibility analysis. Eur. J. Remote Sens. 2017, 50, 211–221.
- James, M.R.; Robson, S.; d’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66.
- Meinen, B.U.; Robinson, D.T. Mapping erosion and deposition in an agricultural landscape: Optimization of UAV image acquisition schemes for SfM-MVS. Remote Sens. Environ. 2020, 239, 111666.
- Li, L.; Mu, X.; Qi, J.; Pisek, J.; Roosjen, P.; Yan, G.; Huang, H.; Liu, S.; Baret, F. Characterizing reflectance anisotropy of background soil in open-canopy plantations using UAV-based multiangular images. ISPRS J. Photogramm. Remote Sens. 2021, 177, 263–278.
- Deng, L.; Chen, Y.; Zhao, Y.; Zhu, L.; Gong, H.-L.; Guo, L.-J.; Zou, H.-Y. An approach for reflectance anisotropy retrieval from UAV-based oblique photogrammetry hyperspectral imagery. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102442.
- Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259.
- Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular dependency of hyperspectral measurements over wheat characterized by a novel UAV based goniometer. Remote Sens. 2015, 7, 725–746.
- Roosjen, P.P.J.; Brede, B.; Suomalainen, J.M.; Bartholomeus, H.M.; Kooistra, L.; Clevers, J.G.P.W. Improved estimation of leaf area index and leaf chlorophyll content of a potato crop using multi-angle spectral data—Potential of unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 14–26.
- Li, K.W.; Jia, H.; Peng, L.; Gan, L. Line-of-sight in operating a small unmanned aerial vehicle: How far can a quadcopter fly in line-of-sight? Appl. Ergon. 2019, 81, 102898. [Google Scholar] [CrossRef]
- Li, K.W.; Sun, C.; Li, N. Distance and Visual Angle of Line-of-Sight of a Small Drone. Appl. Sci. 2020, 10, 5501. [Google Scholar] [CrossRef]
- EASA. Guidelines for UAS Operations in the Open and Specific Category—Ref to Regulation (EU) 2019/947; EASA: Cologne, Germany, 2024. [Google Scholar]
- Slade, G.; Anderson, K.; Graham, H.A.; Cunliffe, A.M. Repeated drone photogrammetry surveys demonstrate that reconstructed canopy heights are sensitive to wind speed but relatively insensitive to illumination conditions. Int. J. Remote Sens. 2024, 28, 24–41. [Google Scholar] [CrossRef]
- Denka Durgan, S.; Zhang, C.; Duecaster, A. Evaluation and enhancement of unmanned aircraft system photogrammetric data quality for coastal wetlands. GIScience Remote Sens. 2020, 57, 865–881. [Google Scholar] [CrossRef]
- Revuelto, J.; Alonso-Gonzalez, E.; Vidaller-Gayan, I.; Lacroix, E.; Izagirre, E.; Rodríguez-López, G.; López-Moreno, J.I. Intercomparison of UAV platforms for mapping snow depth distribution in complex alpine terrain. Cold Reg. Sci. Technol. 2021, 190, 103344. [Google Scholar] [CrossRef]
- Harder, P.; Schirmer, M.; Pomeroy, J.; Helgason, W. Accuracy of snow depth estimation in mountain and prairie environments by an unmanned aerial vehicle. Cryosphere 2016, 10, 2559–2571. [Google Scholar] [CrossRef]
- Tetila, E.C.; Machado, B.B.; Astolfi, G.; Belete, N.A.d.S.; Amorim, W.P.; Roel, A.R.; Pistori, H. Detection and classification of soybean pests using deep learning with UAV images. Comput. Electron. Agric. 2020, 179, 105836. [Google Scholar] [CrossRef]
- Wang, S.; Baum, A.; Zarco-Tejada, P.J.; Dam-Hansen, C.; Thorseth, A.; Bauer-Gottwein, P.; Bandini, F.; Garcia, M. Unmanned Aerial System multispectral mapping for low and variable solar irradiance conditions: Potential of tensor decomposition. ISPRS J. Photogramm. Remote Sens. 2019, 155, 58–71. [Google Scholar] [CrossRef]
- Aasen, H.; Bolten, A. Multi-temporal high-resolution imaging spectroscopy with hyperspectral 2D imagers–From theory to application. Remote Sens. Environ. 2018, 205, 374–389. [Google Scholar] [CrossRef]
- Fawcett, D.; Bennie, J.; Anderson, K. Monitoring spring phenology of individual tree crowns using drone-acquired NDVI data. Remote Sens. Ecol. Conserv. 2021, 7, 227–244. [Google Scholar] [CrossRef]
- Cao, S.; Danielson, B.; Clare, S.; Koenig, S.; Campos-Vargas, C.; Sanchez-Azofeifa, A. Radiometric calibration assessments for UAS-borne multispectral cameras: Laboratory and field protocols. ISPRS J. Photogramm. Remote Sens. 2019, 149, 132–145. [Google Scholar] [CrossRef]
- Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef]
- Miyoshi, G.T.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Näsi, R.; Moriya, É.A.S. Radiometric block adjustment of hyperspectral image blocks in the Brazilian environment. Int. J. Remote Sens. 2018, 39, 4910–4930. [Google Scholar] [CrossRef]
- Wang, Y.; Yang, Z.; Khan, H.A.; Kootstra, G. Improving Radiometric Block Adjustment for UAV Multispectral Imagery under Variable Illumination Conditions. Remote Sens. 2024, 16, 3019. [Google Scholar] [CrossRef]
- Wang, Y.; Yang, Z.; Kootstra, G.; Khan, H.A. The impact of variable illumination on vegetation indices and evaluation of illumination correction methods on chlorophyll content estimation using UAV imagery. Plant Methods 2023, 19, 51. [Google Scholar] [CrossRef] [PubMed]
- Sun, B.; Li, Y.; Huang, J.; Cao, Z.; Peng, X. Impacts of Variable Illumination and Image Background on Rice LAI Estimation Based on UAV RGB-Derived Color Indices. Appl. Sci. 2024, 14, 3214. [Google Scholar] [CrossRef]
- Kizel, F.; Benediktsson, J.A.; Bruzzone, L.; Pedersen, G.B.M.; Vilmundardóttir, O.K.; Falco, N. Simultaneous and Constrained Calibration of Multiple Hyperspectral Images Through a New Generalized Empirical Line Model. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2047–2058. [Google Scholar] [CrossRef]
- Qin, Z.; Li, X.; Gu, Y. An Illumination Estimation and Compensation Method for Radiometric Correction of UAV Multispectral Images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–12. [Google Scholar] [CrossRef]
- Maes, W.H.; Steppe, K. Estimating evapotranspiration and drought stress with ground-based thermal remote sensing in agriculture: A review. J. Exp. Bot. 2012, 63, 4671–4712. [Google Scholar] [CrossRef]
- Heinemann, S.; Siegmann, B.; Thonfeld, F.; Muro, J.; Jedmowski, C.; Kemna, A.; Kraska, T.; Muller, O.; Schultz, J.; Udelhoven, T.; et al. Land Surface Temperature Retrieval for Agricultural Areas Using a Novel UAV Platform Equipped with a Thermal Infrared and Multispectral Sensor. Remote Sens. 2020, 12, 1075. [Google Scholar] [CrossRef]
- King, B.A.; Tarkalson, D.D.; Sharma, V.; Bjorneberg, D.L. Thermal Crop Water Stress Index Base Line Temperatures for Sugarbeet in Arid Western U.S. Agric. Water Manag. 2021, 243, 106459. [Google Scholar] [CrossRef]
- Ekinzog, E.K.; Schlerf, M.; Kraft, M.; Werner, F.; Riedel, A.; Rock, G.; Mallick, K. Revisiting crop water stress index based on potato field experiments in Northern Germany. Agric. Water Manag. 2022, 269, 107664. [Google Scholar] [CrossRef]
- Cunliffe, A.M.; Anderson, K.; Boschetti, F.; Brazier, R.E.; Graham, H.A.; Myers-Smith, I.H.; Astor, T.; Boer, M.M.; Calvo, L.G.; Clark, P.E.; et al. Global application of an unoccupied aerial vehicle photogrammetry protocol for predicting aboveground biomass in non-forest ecosystems. Remote Sens. Ecol. Conserv. 2022, 8, 57–71. [Google Scholar] [CrossRef]
- Mount, R. Acquisition of through-water aerial survey images. Photogramm. Eng. Remote Sens. 2005, 71, 1407–1415. [Google Scholar] [CrossRef]
- De Keukelaere, L.; Moelans, R.; Knaeps, E.; Sterckx, S.; Reusen, I.; De Munck, D.; Simis, S.G.H.; Constantinescu, A.M.; Scrieciu, A.; Katsouras, G.; et al. Airborne Drones for Water Quality Mapping in Inland, Transitional and Coastal Waters—MapEO Water Data Processing and Validation. Remote Sens. 2023, 15, 1345. [Google Scholar] [CrossRef]
- Elfarkh, J.; Johansen, K.; Angulo, V.; Camargo, O.L.; McCabe, M.F. Quantifying Within-Flight Variation in Land Surface Temperature from a UAV-Based Thermal Infrared Camera. Drones 2023, 7, 617. [Google Scholar] [CrossRef]
- Jin, R.; Zhao, L.; Ren, P.; Wu, H.; Zhong, X.; Gao, M.; Nie, Z. An Enhanced Model for Obtaining At-Sensor Brightness Temperature for UAVs Incorporating Meteorological Features and Its Application in Urban Thermal Environment. Sustain. Cities Soc. 2024, 118, 105987. [Google Scholar] [CrossRef]
- Gao, J. Quantitative Remote Sensing: Fundamentals and Environmental Applications; CRC Press: Boca Raton, FL, USA, 2024. [Google Scholar]
- McCoy, R.M. Field Methods in Remote Sensing; Guilford Publications: New York, NY, USA, 2005. [Google Scholar]
- Román, A.; Heredia, S.; Windle, A.E.; Tovar-Sánchez, A.; Navarro, G. Enhancing Georeferencing and Mosaicking Techniques over Water Surfaces with High-Resolution Unmanned Aerial Vehicle (UAV) Imagery. Remote Sens. 2024, 16, 290. [Google Scholar] [CrossRef]
- Jiang, R.; Wang, P.; Xu, Y.; Zhou, Z.; Luo, X.; Lan, Y.; Zhao, G.; Sanchez-Azofeifa, A.; Laakso, K. Assessing the Operation Parameters of a Low-altitude UAV for the Collection of NDVI Values Over a Paddy Rice Field. Remote Sens. 2020, 12, 1850. [Google Scholar] [CrossRef]
- Pepe, M.; Fregonese, L.; Scaioni, M. Planning airborne photogrammetry and remote-sensing missions with modern platforms and sensors. Eur. J. Remote Sens. 2018, 51, 412–436. [Google Scholar] [CrossRef]
- García-Tejero, I.F.; Costa, J.M.; Egipto, R.; Durán-Zuazo, V.H.; Lima, R.S.N.; Lopes, C.M.; Chaves, M.M. Thermal data to monitor crop-water status in irrigated Mediterranean viticulture. Agric. Water Manag. 2016, 176, 80–90. [Google Scholar] [CrossRef]
- Pou, A.; Diago, M.P.; Medrano, H.; Baluja, J.; Tardaguila, J. Validation of thermal indices for water status identification in grapevine. Agric. Water Manag. 2014, 134, 60–72. [Google Scholar] [CrossRef]
- Mirka, B.; Stow, D.A.; Paulus, G.; Loerch, A.C.; Coulter, L.L.; An, L.; Lewison, R.L.; Pflüger, L.S. Evaluation of thermal infrared imaging from uninhabited aerial vehicles for arboreal wildlife surveillance. Environ. Monit. Assess. 2022, 194, 512. [Google Scholar] [CrossRef]
- Whitworth, A.; Pinto, C.; Ortiz, J.; Flatt, E.; Silman, M. Flight speed and time of day heavily influence rainforest canopy wildlife counts from drone-mounted thermal camera surveys. Biodivers. Conserv. 2022, 31, 3179–3195. [Google Scholar] [CrossRef]
- Sângeorzan, D.D.; Păcurar, F.; Reif, A.; Weinacker, H.; Rușdea, E.; Vaida, I.; Rotar, I. Detection and Quantification of Arnica montana L. Inflorescences in Grassland Ecosystems Using Convolutional Neural Networks and Drone-Based Remote Sensing. Remote Sens. 2024, 16, 2012. [Google Scholar] [CrossRef]
- Dering, G.M.; Micklethwaite, S.; Thiele, S.T.; Vollgger, S.A.; Cruden, A.R. Review of drones, photogrammetry and emerging sensor technology for the study of dykes: Best practises and future potential. J. Volcanol. Geotherm. Res. 2019, 373, 148–166. [Google Scholar] [CrossRef]
- Perich, G.; Hund, A.; Anderegg, J.; Roth, L.; Boer, M.P.; Walter, A.; Liebisch, F.; Aasen, H. Assessment of Multi-Image Unmanned Aerial Vehicle Based High-Throughput Field Phenotyping of Canopy Temperature. Front. Plant Sci. 2020, 11, 150. [Google Scholar] [CrossRef]
- Messina, G.; Modica, G. Applications of UAV Thermal Imagery in Precision Agriculture: State of the Art and Future Research Outlook. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
- Awasthi, B.; Karki, S.; Regmi, P.; Dhami, D.S.; Thapa, S.; Panday, U.S. Analyzing the Effect of Distribution Pattern and Number of GCPs on Overall Accuracy of UAV Photogrammetric Results. In Proceedings of UASG 2019; Springer: Cham, Switzerland, 2020; pp. 339–354. [Google Scholar]
- Stöcker, C.; Nex, F.; Koeva, M.; Gerke, M. High-Quality UAV-Based Orthophotos for Cadastral Mapping: Guidance for Optimal Flight Configurations. Remote Sens. 2020, 12, 3625. [Google Scholar] [CrossRef]
- Yu, J.J.; Kim, D.W.; Lee, E.J.; Son, S.W. Determining the Optimal Number of Ground Control Points for Varying Study Sites through Accuracy Evaluation of Unmanned Aerial System-Based 3D Point Clouds and Digital Surface Models. Drones 2020, 4, 49. [Google Scholar] [CrossRef]
- Gindraux, S.; Boesch, R.; Farinotti, D. Accuracy Assessment of Digital Surface Models from Unmanned Aerial Vehicles’ Imagery on Glaciers. Remote Sens. 2017, 9, 186. [Google Scholar] [CrossRef]
- Cabo, C.; Sanz-Ablanedo, E.; Roca-Pardiñas, J.; Ordóñez, C. Influence of the Number and Spatial Distribution of Ground Control Points in the Accuracy of UAV-SfM DEMs: An Approach Based on Generalized Additive Models. IEEE Trans. Geosci. Remote Sens. 2021, 59, 10618–10627. [Google Scholar] [CrossRef]
- Martínez-Carricondo, P.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Mesas-Carrascosa, F.-J.; García-Ferrer, A.; Pérez-Porras, F.-J. Assessment of UAV-photogrammetric mapping accuracy based on variation of ground control points. Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 1–10. [Google Scholar] [CrossRef]
- Forlani, G.; Dall’Asta, E.; Diotri, F.; Cella, U.M.d.; Roncella, R.; Santise, M. Quality Assessment of DSMs Produced from UAV Flights Georeferenced with On-Board RTK Positioning. Remote Sens. 2018, 10, 311. [Google Scholar] [CrossRef]
- Bolkas, D. Assessment of GCP Number and Separation Distance for Small UAS Surveys with and without GNSS-PPK Positioning. J. Surv. Eng. 2019, 145, 04019007. [Google Scholar] [CrossRef]
- Nota, E.W.; Nijland, W.; de Haas, T. Improving UAV-SfM time-series accuracy by co-alignment and contributions of ground control or RTK positioning. Int. J. Appl. Earth Obs. Geoinf. 2022, 109, 102772. [Google Scholar] [CrossRef]
- Hugenholtz, C.; Brown, O.; Walker, J.; Barchyn, T.; Nesbit, P.; Kucharczyk, M.; Myshak, S. Spatial Accuracy of UAV-Derived Orthoimagery and Topography: Comparing Photogrammetric Models Processed with Direct Geo-Referencing and Ground Control Points. GEOMATICA 2016, 70, 21–30. [Google Scholar] [CrossRef]
- Cledat, E.; Jospin, L.V.; Cucci, D.A.; Skaloud, J. Mapping quality prediction for RTK/PPK-equipped micro-drones operating in complex natural environment. ISPRS J. Photogramm. Remote Sens. 2020, 167, 24–38. [Google Scholar] [CrossRef]
- Salas López, R.; Terrones Murga, R.E.; Silva-López, J.O.; Rojas-Briceño, N.B.; Gómez Fernández, D.; Oliva-Cruz, M.; Taddia, Y. Accuracy Assessment of Direct Georeferencing for Photogrammetric Applications Based on UAS-GNSS for High Andean Urban Environments. Drones 2022, 6, 388. [Google Scholar] [CrossRef]
- Žabota, B.; Kobal, M. Accuracy Assessment of UAV-Photogrammetric-Derived Products Using PPK and GCPs in Challenging Terrains: In Search of Optimized Rockfall Mapping. Remote Sens. 2021, 13, 3812. [Google Scholar] [CrossRef]
- Famiglietti, N.A.; Cecere, G.; Grasso, C.; Memmolo, A.; Vicari, A. A Test on the Potential of a Low Cost Unmanned Aerial Vehicle RTK/PPK Solution for Precision Positioning. Sensors 2021, 21, 3882. [Google Scholar] [CrossRef]
- Štroner, M.; Urban, R.; Seidl, J.; Reindl, T.; Brouček, J. Photogrammetry Using UAV-Mounted GNSS RTK: Georeferencing Strategies without GCPs. Remote Sens. 2021, 13, 1336. [Google Scholar] [CrossRef]
- Eltner, A.; Kaiser, A.; Castillo, C.; Rock, G.; Neugirg, F.; Abellán, A. Image-based surface reconstruction in geomorphometry—Merits, limits and developments. Earth Surf. Dynam. 2016, 4, 359–389. [Google Scholar] [CrossRef]
- Berra, E.F.; Peppa, M.V. Advances and Challenges of UAV SFM MVS Photogrammetry and Remote Sensing: Short Review. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; pp. 533–538. [Google Scholar]
- Bagnall, G.C.; Thomasson, J.A.; Yang, C.; Wang, T.; Han, X.; Sima, C.; Chang, A. Uncrewed aerial vehicle radiometric calibration: A comparison of autoexposure and fixed-exposure images. Plant Phenome J. 2023, 6, e20082. [Google Scholar] [CrossRef]
- Swaminathan, V.; Thomasson, J.A.; Hardin, R.G.; Rajan, N.; Raman, R. Selection of appropriate multispectral camera exposure settings and radiometric calibration methods for applications in phenotyping and precision agriculture. Plant Phenome J. 2024, 7, e70000. [Google Scholar] [CrossRef]
- Yuan, W.; Hua, W. A Case Study of Vignetting Nonuniformity in UAV-Based Uncooled Thermal Cameras. Drones 2022, 6, 394. [Google Scholar] [CrossRef]
- Berni, J.A.J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
- Arroyo-Mora, J.P.; Kalacska, M.; Soffer, R.J.; Lucanus, O. Comparison of Calibration Panels from Field Spectroscopy and UAV Hyperspectral Imagery Acquired Under Diffuse Illumination. In Proceedings of the 2021 IEEE International Geoscience and Remote Sensing Symposium IGARSS, Brussels, Belgium, 11–16 July 2021; pp. 60–63. [Google Scholar]
- Wang, Y.; Kootstra, G.; Yang, Z.; Khan, H.A. UAV multispectral remote sensing for agriculture: A comparative study of radiometric correction methods under varying illumination conditions. Biosyst. Eng. 2024, 248, 240–254. [Google Scholar] [CrossRef]
- Cao, H.; Gu, X.; Sun, Y.; Gao, H.; Tao, Z.; Shi, S. Comparing, validating and improving the performance of reflectance obtention method for UAV-Remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102391. [Google Scholar] [CrossRef]
- Vlaminck, M.; Heidbuchel, R.; Philips, W.; Luong, H. Region-Based CNN for Anomaly Detection in PV Power Plants Using Aerial Imagery. Sensors 2022, 22, 1244. [Google Scholar] [CrossRef]
- Quater, P.B.; Grimaccia, F.; Leva, S.; Mussetta, M.; Aghaei, M. Light Unmanned Aerial Vehicles (UAVs) for Cooperative Inspection of PV Plants. IEEE J. Photovolt. 2014, 4, 1107–1113. [Google Scholar] [CrossRef]
- Gadhwal, M.; Sharda, A.; Sangha, H.S.; Merwe, D.V.d. Spatial corn canopy temperature extraction: How focal length and sUAS flying altitude influence thermal infrared sensing accuracy. Comput. Electron. Agric. 2023, 209, 107812. [Google Scholar] [CrossRef]
- Gómez-Candón, D.; Virlet, N.; Labbé, S.; Jolivot, A.; Regnard, J.-L.J.P.A. Field phenotyping of water stress at tree scale by UAV-sensed imagery: New insights for thermal acquisition and calibration. Precis. Agric. 2016, 17, 786–800. [Google Scholar] [CrossRef]
- Han, X.; Thomasson, J.A.; Swaminathan, V.; Wang, T.; Siegfried, J.; Raman, R.; Rajan, N.; Neely, H. Field-Based Calibration of Unmanned Aerial Vehicle Thermal Infrared Imagery with Temperature-Controlled References. Sensors 2020, 20, 7098. [Google Scholar] [CrossRef] [PubMed]
- Aragon, B.; Johansen, K.; Parkes, S.; Malbeteau, Y.; Al-Mashharawi, S.; Al-Amoudi, T.; Andrade, C.F.; Turner, D.; Lucieer, A.; McCabe, M.F. A Calibration Procedure for Field and UAV-Based Uncooled Thermal Infrared Instruments. Sensors 2020, 20, 3316. [Google Scholar] [CrossRef] [PubMed]
- Idso, S.B.; Jackson, R.D.; Pinter, P.J.; Reginato, R.J.; Hatfield, J.L. Normalizing the stress-degree-day parameter for environmental variability. Agric. Meteorol. 1981, 24, 45–55. [Google Scholar] [CrossRef]
- Jackson, R.D.; Idso, S.B.; Reginato, R.J.; Pinter, P.J. Canopy temperature as a crop water-stress indicator. Water Resour. Res. 1981, 17, 1133–1138. [Google Scholar] [CrossRef]
- Maes, W.H.; Achten, W.M.J.; Reubens, B.; Muys, B. Monitoring stomatal conductance of Jatropha curcas seedlings under different levels of water shortage with infrared thermography. Agric. For. Meteorol. 2011, 151, 554–564. [Google Scholar] [CrossRef]
- Maes, W.H.; Baert, A.; Huete, A.R.; Minchin, P.E.H.; Snelgar, W.P.; Steppe, K. A new wet reference target method for continuous infrared thermography of vegetations. Agric. For. Meteorol. 2016, 226–227, 119–131. [Google Scholar] [CrossRef]
- Meron, M.; Tsipris, J.; Charitt, D. Remote mapping of crop water status to assess spatial variability of crop stress. In Precision Agriculture, Proceedings of the 4th European Conference on Precision Agriculture, Berlin, Germany, 15 June 2003; Stafford, J., Werner, A., Eds.; Academic Publishers: Wageningen, The Netherlands, 2003; pp. 405–410. [Google Scholar]
- Möller, M.; Alchanatis, V.; Cohen, Y.; Meron, M.; Tsipris, J.; Naor, A.; Ostrovsky, V.; Sprintsin, M.; Cohen, S. Use of thermal and visible imagery for estimating crop water status of irrigated grapevine. J. Exp. Bot. 2007, 58, 827–838. [Google Scholar] [CrossRef]
- Prashar, A.; Jones, H. Infra-Red Thermography as a High-Throughput Tool for Field Phenotyping. Agronomy 2014, 4, 397. [Google Scholar] [CrossRef]
- Ribeiro-Gomes, K.; Hernandez-Lopez, D.; Ortega, J.F.; Ballesteros, R.; Poblete, T.; Moreno, M.A. Uncooled thermal camera calibration and optimization of the photogrammetry process for UAV applications in agriculture. Sensors 2017, 17, 2173. [Google Scholar] [CrossRef]
- Maes, W.H.; Huete, A.; Avino, M.; Boer, M.; Dehaan, R.; Pendall, E.; Griebel, A.; Steppe, K. Can UAV-based infrared thermography be used to study plant-parasite interactions between mistletoe and eucalypt trees? Remote Sens. 2018, 10, 2062. [Google Scholar] [CrossRef]
- Deery, D.M.; Rebetzke, G.J.; Jimenez-Berni, J.A.; James, R.A.; Condon, A.G.; Bovill, W.D.; Hutchinson, P.; Scarrow, J.; Davy, R.; Furbank, R.T. Methodology for High-Throughput Field Phenotyping of Canopy Temperature Using Airborne Thermography. Front. Plant Sci. 2016, 7, 1808. [Google Scholar] [CrossRef] [PubMed]
- Dai, W.; Qiu, R.; Wang, B.; Lu, W.; Zheng, G.; Amankwah, S.O.Y.; Wang, G. Enhancing UAV-SfM Photogrammetry for Terrain Modeling from the Perspective of Spatial Structure of Errors. Remote Sens. 2023, 15, 4305. [Google Scholar] [CrossRef]
- Tang, Z.; Wang, M.; Schirrmann, M.; Dammer, K.-H.; Li, X.; Brueggeman, R.; Sankaran, S.; Carter, A.H.; Pumphrey, M.O.; Hu, Y.; et al. Affordable High Throughput Field Detection of Wheat Stripe Rust Using Deep Learning with Semi-Automated Image Labeling. Comput. Electron. Agric. 2023, 207, 107709. [Google Scholar] [CrossRef]
- Raymaekers, D.; Delalieux, S. UAV-Based Remote Sensing: Improve efficiency through sampling missions. In Proceedings of the UAV-Based Remote Sensing Methods for Monitoring Vegetation, Köln, Germany, 30 September–1 October 2024. [Google Scholar]
| Parameter | 3D Model: Terrain | 3D Model: Canopy | Orthomosaic: RGB | Orthomosaic: Reflectance (Multi-/Hyperspectral) | Orthomosaic: Thermal |
|---|---|---|---|---|---|
| Overlap | >70 V, >50 H * | >80 V, >70 H ** | >60 V, >50 H | >80 V, >80 H | >80 V, >80 H |
| Flight speed | Normal | Slow | Normal | Slow | Slow |
| Grid pattern? | Yes | No | No | No | No |
| Flight direction | Standard | Standard | Standard | Perpendicular to sun *** | Standard (?) |
| Viewing angle | Include oblique | Nadir | Nadir | Nadir | Nadir |
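The overlap percentages above (forward overlap along the flight line and lateral overlap between lines) translate into an exposure interval and a flight-line spacing via the standard nadir-footprint geometry. A minimal sketch; the camera parameters below (1-inch sensor, 8.8 mm lens, 100 m altitude) are illustrative assumptions, not values from the paper:

```python
def footprint(sensor_dim_mm, focal_mm, altitude_m):
    """Ground distance (m) covered by one sensor dimension for a nadir view."""
    return altitude_m * sensor_dim_mm / focal_mm

def spacing(footprint_m, overlap_pct):
    """Distance (m) between consecutive exposures (forward overlap) or
    between adjacent flight lines (lateral overlap)."""
    return footprint_m * (1.0 - overlap_pct / 100.0)

# Assumed camera: 1-inch sensor (13.2 mm x 8.8 mm), 8.8 mm lens, flown at
# 100 m; overlaps from the reflectance column above (>80 forward, >80 lateral).
# Which sensor side faces along-track depends on how the camera is mounted.
along = footprint(8.8, 8.8, 100.0)    # 100.0 m along-track (short side)
across = footprint(13.2, 8.8, 100.0)  # 150.0 m across-track (long side)
print(spacing(along, 80))   # 20.0 m between exposures
print(spacing(across, 80))  # 30.0 m between flight lines
```

Raising the overlap requirement shrinks both spacings linearly, which is why the reflectance and thermal columns cost substantially more flight time than an RGB orthomosaic mission over the same area.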
| UAV | Height (m) | Diagonal Size (m) | Maximum Distance (m) (Equation (5)) | Maximum Distance (m) (Equation (6)) |
|---|---|---|---|---|
| DJI Mini4 | 0.064 | 0.213 | 56 | 90 |
| DJI Mavic 3 | 0.107 | 0.381 | 94 | 145 |
| DJI Phantom | 0.28 | 0.59 | 245 | 213 |
| DJI M350 | 0.43 | 0.895 | 379 | 313 |
| DJI M600 | 0.759 | 1.669 | 669 | 566 |
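Equations (5) and (6) themselves appear earlier in the paper; purely as a sketch, the tabulated distances are well approximated by a minimum-visual-angle model on airframe height and a linear fit on diagonal size. Both approximations below are reverse-engineered from the table values (they reproduce them to within about 1%) and are assumptions, not the paper's exact formulas:

```python
import math

def max_distance_height(height_m, theta_deg=0.065):
    """Approximation of Equation (5): the distance (m) at which the airframe
    height subtends a minimum resolvable visual angle (assumed ~0.065 deg,
    fitted to the tabulated values)."""
    return height_m / math.tan(math.radians(theta_deg))

def max_distance_diagonal(diag_m, slope=326.9, intercept=20.4):
    """Approximation of Equation (6): a linear function of diagonal size,
    with slope and intercept fitted to the tabulated values."""
    return slope * diag_m + intercept

# Reproduces the table within ~1%:
print(round(max_distance_height(0.107)))    # ~94 m  (DJI Mavic 3, Eq. 5)
print(round(max_distance_diagonal(0.381)))  # ~145 m (DJI Mavic 3, Eq. 6)
```

Note how the two models cross between the Mavic 3 and the Phantom: for small airframes the diagonal-based estimate (Equation (6)) is the more permissive one, while for larger platforms the height-based estimate (Equation (5)) allows the greater visual line-of-sight distance.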
| Parameter | 3D Model: Terrain | 3D Model: Canopy | Orthomosaic: RGB | Orthomosaic: Reflectance (Multi-/Hyperspectral) | Orthomosaic: Thermal |
|---|---|---|---|---|---|
| Illumination | Best overcast, sunny possible; preferably not variable * | Best overcast, sunny possible; preferably not variable * | Preferably not variable | Preferably sunny, but not required | Sunny |
| Wind speed | Not relevant | Low | Best low, but can be higher | Best low, but can be higher | Low |
| Time of flight | Less relevant; if sunny conditions, best around solar noon | Less relevant; if sunny conditions, best around solar noon | Less relevant | Solar noon (but avoid hot spot) | Solar noon (but avoid hot spot) |
| GCP? | Yes | Yes | Yes | Yes | Yes |
| Reference targets | Not relevant | Not relevant | Grey panel(s) recommended | Single or multiple grey panels * | Aluminium foil-covered panel + temperature panels (+extreme temperature panels) |
© 2025 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Maes, W.H. Practical Guidelines for Performing UAV Mapping Flights with Snapshot Sensors. Remote Sens. 2025, 17, 606. https://doi.org/10.3390/rs17040606