Spatial-Spectral-Emissivity Land-Cover Classification Fusing Visible and Thermal Infrared Hyperspectral Imagery
Abstract
1. Introduction
- (1) Multi-feature fusion framework for the high-resolution visible imagery. In the proposed classification framework, features are extracted from the high-resolution visible imagery for different purposes. The spectral feature provides the fundamental information about the ground-object categories, and the visible difference vegetation index (VDVI) [18] is adopted for its effectiveness in separating green vegetation (a minimal sketch of this index is given after this list). In addition, texture features and object-based features are extracted to exploit the spatial correlation between neighboring pixels. A multi-feature fusion framework for the visible data is proposed to integrate these features into a spectral-spatial feature set that represents the high-resolution visible imagery.
- (2) Emissivity retrieval from the thermal infrared hyperspectral imagery. Thermal infrared hyperspectral imagery is a promising data source for the identification of man-made objects. In the proposed approach, the emissivity information is retrieved with an automated atmospheric compensation and temperature-emissivity separation (TES) method, the FLAASH-IR (Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes—Infrared) algorithm [19], and is used to construct the spatial-spectral-emissivity feature set (an illustrative TES sketch is also given after this list). This feature set is then fed into the unary potential term of the SSECRF algorithm, leading to better classification performance.
- (3) Spatial-spectral-emissivity land-cover classification based on conditional random fields (SSECRF). The CRF model can incorporate the spatial contextual information in both the labels and the observed data. In the proposed SSECRF algorithm, the spatial-spectral feature set from the visible imagery and the emissivity from the thermal infrared hyperspectral imagery are integrated by constructing an energy function to carry out land-cover classification. The spatial-spectral and emissivity features are fused in the unary potential by computing the class probabilities. The pairwise potential models the spatial correlation of the imagery, under the assumption that adjacent pixels usually belong to the same class, and aims to correct misclassified pixels by exploiting shape, texture, and spectral features. In this way, the spatial, spectral, and emissivity information is fused efficiently through the potential terms of the SSECRF algorithm (a generic form of this energy is sketched after this list).
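The VDVI mentioned in contribution (1) is commonly written as (2G − R − B) / (2G + R + B) over the red, green, and blue bands [18]. The sketch below computes this index; the array layout, band ordering, and function name are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def vdvi(rgb):
    """Visible difference vegetation index (VDVI) for an RGB image.

    rgb : array of shape (rows, cols, 3), assumed to be ordered R, G, B.
    Uses the commonly cited form (2G - R - B) / (2G + R + B); values
    close to +1 indicate green vegetation.
    """
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    num = 2.0 * g - r - b
    den = 2.0 * g + r + b
    out = np.zeros_like(num)
    np.divide(num, den, out=out, where=den != 0)  # guard against division by zero
    return out
```

The resulting index map can then be stacked with the spectral bands and the texture and object-based features to form the spectral-spatial feature set.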
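The FLAASH-IR algorithm [19] referenced in contribution (2) couples automated atmospheric compensation with temperature-emissivity separation; its internals are not reproduced here. Purely as a rough illustration of the TES step, the sketch below follows the spirit of a normalized-emissivity approach (cf. Gillespie et al. in the references): it assumes that atmospheric compensation has already produced surface-leaving radiance, assumes a maximum emissivity value, and inverts the Planck function to obtain a surface temperature and per-band emissivities. All names, units, and the eps_max value are illustrative assumptions.

```python
import numpy as np

H = 6.626e-34   # Planck constant [J s]
C = 2.998e8     # speed of light [m/s]
K = 1.381e-23   # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    c1 = 2.0 * H * C**2 / wavelength_m**5
    return c1 / np.expm1(H * C / (wavelength_m * K * temp_k))

def tes_normalized_emissivity(surface_radiance, wavelengths_m, eps_max=0.97):
    """Toy temperature-emissivity separation in the normalized-emissivity spirit.

    surface_radiance : (bands,) surface-leaving radiance [W m^-2 sr^-1 m^-1]
                       after atmospheric compensation.
    wavelengths_m    : (bands,) band-centre wavelengths in metres.
    The band giving the highest brightness temperature under the assumed
    emissivity eps_max fixes the surface temperature; per-band emissivity
    then follows as L / B(lambda, T).
    """
    c1 = 2.0 * H * C**2 / wavelengths_m**5
    c2 = H * C / (wavelengths_m * K)
    temps = c2 / np.log1p(eps_max * c1 / surface_radiance)  # brightness temperatures
    t_est = temps.max()                                     # single surface temperature
    emissivity = surface_radiance / planck_radiance(wavelengths_m, t_est)
    return t_est, np.clip(emissivity, 0.0, 1.0)
```

In the paper itself, the emissivity spectrum retrieved by FLAASH-IR is what feeds the spatial-spectral-emissivity feature set, not this simplified retrieval.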
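The exact unary and pairwise potentials of SSECRF are defined in Section 3.2.4. As a generic illustration of how such an energy is assembled, the sketch below evaluates a CRF energy with a probability-based unary term and a contrast-sensitive Potts pairwise term on a 4-connected grid; these are standard forms, and all names and parameters are hypothetical rather than the SSECRF definitions.

```python
import numpy as np

def crf_energy(labels, prob, features, beta=1.0, theta=10.0):
    """Energy of a label field under a simple pairwise CRF.

    labels   : (H, W) integer array of class indices.
    prob     : (H, W, C) per-pixel class probabilities (e.g., from an SVM),
               used in the unary term as -log p(label | features).
    features : (H, W, D) feature image driving the contrast-sensitive
               Potts pairwise term over a 4-connected neighbourhood.
    beta     : pairwise weight; theta scales the feature contrast.
    """
    h, w = labels.shape
    rows, cols = np.indices((h, w))
    # Unary term: negative log-probability of the assigned label.
    unary = -np.log(prob[rows, cols, labels] + 1e-12).sum()

    pairwise = 0.0
    for dy, dx in ((0, 1), (1, 0)):              # right and down neighbours
        a = labels[: h - dy, : w - dx]
        b = labels[dy:, dx:]
        diff = features[: h - dy, : w - dx] - features[dy:, dx:]
        contrast = np.exp(-np.sum(diff ** 2, axis=-1) / theta)
        # Penalise label changes, but less so across strong feature edges.
        pairwise += (contrast * (a != b)).sum()

    return unary + beta * pairwise
```

Inference then seeks the labelling that minimizes this energy, typically with graph-cut-based algorithms such as the one of Boykov et al. cited in the references, rather than simply evaluating it.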
2. Materials
2.1. Study Site
2.2. Dataset Used
3. Spatial-Spectral-Emissivity Land-Cover Classification Based on Conditional Random Fields (SSECRF)
3.1. Background
3.1.1. Emissivity Retrieval from the Thermal Infrared Hyperspectral Imagery
3.1.2. Conditional Random Fields (CRF)
3.2. Methodology of SSECRF
3.2.1. Feature Extraction from the Thermal Infrared Hyperspectral Imagery
3.2.2. Multi-Feature Extraction from the Visible Imagery
3.2.3. Construction of the Spatial-Spectral-Emissivity (SSE) Feature Set
3.2.4. Classification Based on SSECRF
4. Experiments and Analysis
4.1. Experimental Description
4.2. Experimental Results and Analysis
4.2.1. Validity Analysis of the Multi-Feature Fusion Framework for the High-Resolution Visible Imagery
4.2.2. Validity Analysis for the SSE Feature Set
4.2.3. Validity Analysis of the SSECRF Algorithm
4.3. Sensitivity Analysis
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Cihlar, J. Land cover mapping of large areas from satellites: Status and research priorities. Int. J. Remote Sens. 2000, 21, 1093–1114.
- Zhong, Y.; Cao, Q.; Zhao, J.; Ma, A.; Zhao, B.; Zhang, L. Optimal Decision Fusion for Urban Land-Use/Land-Cover Classification Based on Adaptive Differential Evolution Using Hyperspectral and LiDAR Data. Remote Sens. 2017, 9, 868.
- Zhu, Q.; Zhong, Y.; Zhang, L.; Li, D. Scene Classification Based on Fully Sparse Semantic Topic Model. IEEE Trans. Geosci. Remote Sens. 2017, 55.
- Haralick, R.M.; Shanmugam, K. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, 6, 610–621.
- Pesaresi, M.; Benediktsson, J.A. A new approach for the morphological segmentation of high-resolution satellite imagery. IEEE Trans. Geosci. Remote Sens. 2001, 39, 309–320.
- Taubenböck, H.; Esch, T.; Wurm, M.; Roth, A.; Dech, S. Object-based feature extraction using high spatial resolution satellite data of urban areas. J. Spat. Sci. 2010, 55, 117–132.
- Cheng, J.; Liang, S.; Wang, J.; Li, X. A stepwise refining algorithm of temperature and emissivity separation for hyperspectral thermal infrared data. IEEE Trans. Geosci. Remote Sens. 2010, 48, 1588–1597.
- Riley, D.N.; Hecker, C.A. Mineral mapping with airborne hyperspectral thermal infrared remote sensing at Cuprite, Nevada, USA. In Thermal Infrared Remote Sensing: Sensors, Methods, Applications; Kuenzer, C., Dech, S., Eds.; Springer: Berlin, Germany, 2013; pp. 495–514.
- Fontanilles, G.; Briottet, X.; Fabre, S.; Lefebvre, S.; Vandenhaute, P.F. Aggregation process of optical properties and temperature over heterogeneous surfaces in infrared domain. Appl. Opt. 2010, 49, 4655–4669.
- Liao, W.; Huang, X.; Van Coillie, F.; Gautama, S.; Pižurica, A.; Philips, W.; Liu, H.; Zhu, T.; Shimoni, M.; Moser, G. Processing of multiresolution thermal hyperspectral and digital color data: Outcome of the 2014 IEEE GRSS data fusion contest. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 2984–2996.
- Michaelsen, E. Self-organizing maps for fusion of thermal hyperspectral with high-resolution VIS data. In Proceedings of the 8th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS 2014), Stockholm, Sweden, 24 August 2014; pp. 1–4.
- Hasani, H.; Samadzadegan, F. 3D object classification based on thermal and visible imagery in urban area. In Proceedings of the International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Göttingen, Germany, 23–25 November 2015; Volume XL.1, pp. 287–291.
- Li, J.; Zhang, H.; Guo, M.; Zhang, L.; Shen, H.; Du, Q. Urban classification by the fusion of thermal infrared hyperspectral and visible data. Photogramm. Eng. Remote Sens. 2015, 81, 901–911.
- Lu, X.; Zhang, J.; Li, T.; Zhang, G. Synergetic classification of long-wave infrared hyperspectral and visible images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 3546–3557.
- Marwaha, R.; Kumar, A.; Kumar, A.S. Object-oriented and pixel-based classification approach for land cover using airborne long-wave infrared hyperspectral data. J. Appl. Remote Sens. 2015, 9, 095040.
- Akbari, D.; Homayouni, S.; Safari, A.; Mehrshad, N. Mapping urban land cover based on spatial-spectral classification of hyperspectral remote-sensing data. Int. J. Remote Sens. 2016, 37, 440–454.
- Samadzadegan, F.; Hasani, H.; Reinartz, P. Toward optimum fusion of thermal hyperspectral and visible images in classification of urban area. Photogramm. Eng. Remote Sens. 2017, 83, 269–280.
- Wang, X.; Wang, M.; Wang, S.; Wu, Y. Extraction of vegetation information from visible unmanned aerial vehicle images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 152–159.
- Adler-Golden, S.; Conforti, P.; Gagnon, M.; Tremblay, P.; Chamberland, M. Long-wave infrared surface reflectance spectra retrieved from Telops Hyper-Cam imagery. In Proceedings of the Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XX, Baltimore, MD, USA, 5 May 2014; Volume 9088, p. 90880U.
- Sobrino, J.; Raissouni, N.; Li, Z.L. A comparative study of land surface emissivity retrieval from NOAA data. Remote Sens. Environ. 2001, 75, 256–266.
- Gillespie, A.; Rokugawa, S.; Matsunaga, T.; Cothern, J.S.; Hook, S.; Kahle, A.B. A temperature and emissivity separation algorithm for advanced spaceborne thermal emission and reflection radiometer (ASTER) images. IEEE Trans. Geosci. Remote Sens. 1998, 36, 1113–1126.
- Jin, M.; Liang, S. An improved land surface emissivity parameter for land surface models using global remote sensing observations. J. Clim. 2006, 19, 2867–2881.
- Gagnon, M.A.; Tremblay, P.; Savary, S.; Duval, M.; Farley, V.; Lagueux, P.; Guyot, É.; Chamberland, M. Airborne thermal infrared hyperspectral imaging for mineral mapping. In Proceedings of the International Workshop on Advanced Infrared Technology & Applications, Pisa, Italy, 29 September–2 October 2015; pp. 83–86.
- Kastek, M.; Piątkowski, T.; Trzaskawka, P. Infrared imaging Fourier transform spectrometer as the stand-off gas detection system. Metrol. Meas. Syst. 2011, 18, 607–620.
- Neinavaz, E.; Darvishzadeh, R.; Skidmore, A.K.; Groen, T.A. Measuring the response of canopy emissivity spectra to leaf area index variation using thermal hyperspectral data. Int. J. Appl. Earth Obs. 2016, 53, 40–47.
- Lafferty, J.; McCallum, A.; Pereira, F. Conditional random fields: Probabilistic models for segmenting and labeling sequence data. In Proceedings of the International Conference on Machine Learning (ICML), Williamstown, MA, USA, 28 June–1 July 2001; pp. 282–289.
- Kumar, S. Discriminative random fields: A discriminative framework for contextual interaction in classification. In Proceedings of the Ninth IEEE International Conference on Computer Vision, Nice, France, 13–16 October 2003; pp. 1150–1157.
- Zhong, P.; Wang, R. Jointly learning the hybrid CRF and MLR model for simultaneous denoising and classification of hyperspectral imagery. IEEE Trans. Neural Netw. Learn. Syst. 2014, 25, 1319–1334.
- Zhong, P.; Wang, R. Learning conditional random fields for classification of hyperspectral images. IEEE Trans. Image Process. 2010, 19, 1890–1907.
- Yang, M.Y.; Förstner, W. A hierarchical conditional random field model for labeling and classifying images of man-made scenes. In Proceedings of the IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain, 6–13 November 2011; pp. 196–203.
- Wegner, J.D.; Hansch, R.; Thiele, A.; Soergel, U. Building detection from one orthophoto and high-resolution InSAR data using conditional random fields. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 83–91.
- Lv, P.; Zhong, Y.; Zhao, J.; Jiao, H.; Zhang, L. Change detection based on a multifeature probabilistic ensemble conditional random field model for high spatial resolution remote sensing imagery. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1965–1969.
- Zhao, J.; Zhong, Y.; Shu, H.; Zhang, L. High-resolution image classification integrating spectral-spatial-location cues by conditional random fields. IEEE Trans. Image Process. 2016, 25, 4033–4045.
- Kumar, S.; Hebert, M. Discriminative random fields. Int. J. Comput. Vis. 2006, 68, 179–201.
- Kohli, P.; Torr, P.H. Robust higher order potentials for enforcing label consistency. Int. J. Comput. Vis. 2009, 82, 302–324.
- Wang, C.; Wu, H.C.; Principe, J.C. Cost function for robust estimation of PCA. In Proceedings of the SPIE 2760, Applications and Science of Artificial Neural Networks II, Orlando, FL, USA, 8 April 1996; pp. 120–127.
- Green, A.; Craig, M.; Shi, C. The application of the minimum noise fraction transform to the compression and cleaning of hyperspectral remote sensing data. In Proceedings of the IGARSS'88: Remote Sensing—Moving towards the 21st Century/International Geoscience and Remote Sensing Symposium, Edinburgh, UK, 12–16 September 1988; Volume 3, p. 1807.
- Murinto, K.; Nur, R.D.P. Feature reduction using minimum noise fraction and principal component analysis transforms for improving the classification of hyperspectral image. Asia-Pac. J. Sci. Technol. 2017, 22, 1–5.
- Shotton, J.; Winn, J.; Rother, C.; Criminisi, A. TextonBoost: Joint appearance, shape and context modeling for multi-class object recognition and segmentation. In Proceedings of the 9th European Conference on Computer Vision, Graz, Austria, 7–13 May 2006; pp. 1–15.
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297.
- Zhao, J.; Zhong, Y.; Zhang, L. Detail-preserving smoothing classifier based on conditional random fields for high spatial resolution remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2440–2452.
- Rother, C.; Kolmogorov, V.; Blake, A. GrabCut: Interactive foreground extraction using iterated graph cuts. ACM Trans. Graph. 2004, 23, 309–314.
- Boykov, Y.; Veksler, O.; Zabih, R. Fast approximate energy minimization via graph cuts. IEEE Trans. Pattern Anal. 2001, 23, 1222–1239.
- 2014 IEEE GRSS Data Fusion Classification Contest Results. Available online: http://www.grss-ieee.org/community/technical-committees/data-fusion/2014-ieee-grss-data-fusion-classification-contest-results (accessed on 30 August 2017).
Class | Training Samples | Test Samples |
---|---|---|
Road | 112,457 | 809,098 |
Trees | 27,700 | 100,749 |
Red roof | 46,578 | 136,697 |
Grey roof | 53,520 | 142,868 |
Concrete roof | 97,826 | 109,539 |
Vegetation | 185,329 | 103,583 |
Bare soil | 44,738 | 49,212 |
Metric | VIS 1 | VIS + VDVI 2 | VIS + Obj. 3 | VIS + Tex. 4 | VIS + Tex. + Obj. | VIS + Tex. + Obj. + VDVI
---|---|---|---|---|---|---|
OA (%) | 82.3218 | 82.4446 | 81.8945 | 81.7489 | 84.9826 | 85.1477
Kappa | 0.7438 | 0.7452 | 0.7427 | 0.7401 | 0.7835 | 0.7857 |
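For reference, the overall accuracy (OA) and kappa coefficient reported in the table above can be computed from a confusion matrix of counts as follows; this is a minimal, generic sketch with illustrative names, not the evaluation code used in the paper.

```python
import numpy as np

def oa_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a confusion matrix of counts."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    oa = np.trace(confusion) / total                     # observed agreement
    # Chance agreement expected from the row and column marginals.
    pe = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    kappa = (oa - pe) / (1.0 - pe)
    return oa, kappa
```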
Class | Road | Trees | Red Roof | Grey Roof | Concrete Roof | Vegetation | Bare Soil
---|---|---|---|---|---|---|---|
Road | 92.83 | 0.18 | 0.04 | 1.84 | 0.79 | 1.33 | 0.63 |
Trees | 0.05 | 89.32 | 0.13 | 0 | 0.03 | 20.65 | 0 |
Red roof | 0.01 | 0.02 | 96.57 | 4.31 | 0.01 | 0.01 | 7.39 |
Grey roof | 5.24 | 0.31 | 2.77 | 91.02 | 3.55 | 0.06 | 0.25 |
Concrete roof | 1.84 | 0 | 0.22 | 2.84 | 95.62 | 0.01 | 0.13 |
Vegetation | 0 | 10.03 | 0.21 | 0 | 0 | 77.93 | 2.37 |
Bare soil | 0.03 | 0.14 | 0.07 | 0 | 0 | 0.01 | 89.22 |
Class | Road | Trees | Red Roof | Grey Roof | Concrete Roof | Vegetation | Bare Soil
---|---|---|---|---|---|---|---|
Road | 92.9 | 0.19 | 0.03 | 2.26 | 0.72 | 1.16 | 0.5 |
Trees | 0.04 | 89.23 | 0.1 | 0 | 0.01 | 19.97 | 0 |
Red roof | 0.01 | 0.02 | 95.89 | 3.48 | 0.01 | 0.01 | 3.13 |
Grey roof | 5.11 | 0.13 | 2.85 | 90.6 | 4.83 | 0.05 | 0.3 |
Concrete roof | 1.83 | 0 | 0.11 | 3.65 | 94.37 | 0.01 | 0.39 |
Vegetation | 0.01 | 10.35 | 0.12 | 0 | 0 | 78.46 | 1.96 |
Bare soil | 0.1 | 0.07 | 0.91 | 0 | 0.05 | 0.35 | 93.71 |
Class | Road | Trees | Red Roof | Grey Roof | Concrete Roof | Vegetation | Bare Soil
---|---|---|---|---|---|---|---|
Road | 94.58 | 0.01 | 0 | 2.30 | 0.25 | 0.94 | 0.80 |
Trees | 0 | 91.42 | 0.07 | 0 | 0 | 12.89 | 0 |
Red roof | 0 | 0 | 97.39 | 3.04 | 0 | 0 | 0.21 |
Grey roof | 4.80 | 0.03 | 2.44 | 94.66 | 4.19 | 0 | 0 |
Concrete roof | 0.62 | 0 | 0 | 0 | 95.56 | 0 | 0 |
Vegetation | 0 | 8.54 | 0 | 0 | 0 | 86.16 | 0.05 |
Bare soil | 0 | 0 | 0.20 | 0 | 0 | 0 | 98.93 |