The DOM Generation and Precise Radiometric Calibration of a UAV-Mounted Miniature Snapshot Hyperspectral Imager
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Equipment
2.1.1. Hyperspectral Imaging System
2.1.2. Illumination Sources and Spectrometers
2.2. Experimental Field and Data Acquisition
2.3. Radiometric Response Linearity and Radiometric Response Variation
2.3.1. Radiometric Response Linearity
2.3.2. Radiometric Response Variation
2.4. Generation of the DOM
2.4.1. Preliminary Processing of POS Data and Hyperspectral Images
2.4.2. Generation of the DOM and CSM
2.5. Spectral Calibration and Radiometric Calibration
2.5.1. Spectral Calibration
2.5.2. Radiometric Calibration and Verification
3. Results
3.1. Pixel Response Linearity Test and RRV Correction
3.2. Generated DOM and CSM
3.2.1. Interpolated POS, Generated DOM and CSM
3.2.2. Precision Verification
3.3. Spectral Calibration
3.4. Radiometric Calibration Evaluation
4. Discussion
4.1. Spectral and Radiometric Calibration
4.2. Generated Hyperspectral DOM and DEM
4.3. Potential Advantages of Snapshot Hyperspectral Cameras Coupled with Other Imaging Sensors
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
| Specification | Value | Specification | Value |
|---|---|---|---|
| Wavelength range | 450–950 nm | Housing | 28 cm × 6.5 cm × 7 cm |
| Sampling interval | 4 nm | Digitization | 12 bit |
| Spectral resolution | 8 nm at 532 nm | Field angle | 19° |
| Channels | 125 | Cube resolution | 1 megapixel |
| Detector | Si CCD | Spectral throughput | 2500 spectra/cube |
| Weight | 470 g | Power | DC 12 V, 15 W |
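For readers gauging imaging coverage from these specifications, the short Python sketch below estimates the imager's ground footprint and per-pixel ground sample distance from the 19° field angle listed in the table; the square 1000 × 1000 pixel frame (inferred from the 1 megapixel cube resolution) and the 50 m flight altitude are illustrative assumptions, not values reported in the article.

```python
import math

def footprint_and_gsd(altitude_m, field_angle_deg, pixels_across):
    """Estimate swath width (m) and ground sample distance (m/pixel)
    for a nadir-pointing frame camera over flat terrain."""
    swath = 2.0 * altitude_m * math.tan(math.radians(field_angle_deg) / 2.0)
    gsd = swath / pixels_across
    return swath, gsd

# 19 deg field angle comes from the specification table;
# the 50 m altitude and 1000-pixel frame width are assumptions.
swath, gsd = footprint_and_gsd(altitude_m=50.0, field_angle_deg=19.0, pixels_across=1000)
print(f"Swath width: {swath:.1f} m, GSD: {gsd * 100:.1f} cm/pixel")
```

Under these assumed inputs the sketch yields a swath of roughly 16.7 m and a ground sample distance of about 1.7 cm per pixel.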
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Citation
Yang, G.; Li, C.; Wang, Y.; Yuan, H.; Feng, H.; Xu, B.; Yang, X. The DOM Generation and Precise Radiometric Calibration of a UAV-Mounted Miniature Snapshot Hyperspectral Imager. Remote Sens. 2017, 9, 642. https://doi.org/10.3390/rs9070642