Spectral-Spatial Hyperspectral Image Classification With Edge-Preserving Filtering

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, VOL. 52, NO. 5, MAY 2014
Abstract—The integration of spatial context in the classification of hyperspectral images is known to be an effective way in improving classification accuracy. In this paper, a novel spectral–spatial classification framework based on edge-preserving filtering is proposed. The proposed framework consists of the following three steps. First, the hyperspectral image is classified using a pixelwise classifier, e.g., the support vector machine classifier. Then, the resulting classification map is represented as multiple probability maps, and edge-preserving filtering is conducted on each probability map, with the first principal component or the first three principal components of the hyperspectral image serving as the gray or color guidance image. Finally, according to the filtered probability maps, the class of each pixel is selected based on the maximum probability. Experimental results demonstrate that the proposed edge-preserving filtering based classification method can improve the classification accuracy significantly in a very short time. Thus, it can be easily applied in real applications.

Index Terms—Classification, edge-preserving filters (EPFs), hyperspectral data, spatial context.

I. INTRODUCTION

… images [6], e.g., random forests [7], [8], neural networks [9], [10], AdaBoost [11], support vector machines (SVMs) [12], sparse representation [13], [14], and active learning [15]–[17] methods. Among these methods, the SVM classifier has, in particular, shown a good performance in terms of classification accuracy [12] since it has two major advantages: First, it requires relatively few training samples to obtain good classification accuracies; second, it is robust to the spectral dimension of hyperspectral images [12].

To improve the classification performance further, many researchers have worked on spectral–spatial classification, which can incorporate the spatial contextual information into the pixelwise classifiers [18]. For example, extended morphological profiles (EMPs) have been proposed for constructing spectral–spatial features [19], which are adaptive definitions of the neighborhood of pixels. Furthermore, spectral–spatial kernels, e.g., composite [20], morphological [21], and graphic [22] kernels, have also been proposed for the improvement of the SVM classifier. The morphological and kernel-based
0196-2892 © 2013 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
KANG et al.: SPECTRAL–SPATIAL HYPERSPECTRAL IMAGE CLASSIFICATION 2667
II. EDGE-PRESERVING FILTERING

During the last decade, many different EPFs, e.g., the joint bilateral filter [33], the weighted least-squares (WLS) filter [36], the guided filter [34], the domain transform filter [45], the local linear Stein's unbiased risk estimate filter [46], and the L0 gradient filter [47], have been proposed. Most of these EPFs share a similar property, i.e., they can be used for joint filtering, where the content of one image is smoothed based on the edge information of a guidance image. This means that the spatial information of the guidance image can be taken into account in the filtering process. In this section, the two most widely used EPFs, i.e., the joint bilateral filter and the guided filter, are illustrated.

A. Joint Bilateral Filter

The joint bilateral filter is based on the widely used Gaussian filter, considering both the distance in the image plane (the spatial domain) and the distance along the intensity axis (the range domain). The spatial and range distances are defined using two Gaussian decreasing functions, i.e., G_δs(‖i − j‖) = exp(−‖i − j‖²/δs²) and G_δr(|I_i − I_j|) = exp(−|I_i − I_j|²/δr²). Specifically, the filtering output O_i of the input pixel P_i can be represented as a weighted average of its neighborhood pixels P_j as follows:

O_i = (1/K_i^b) Σ_{j∈ω_i} G_δs(‖i − j‖) G_δr(|I_i − I_j|) P_j    (1)

where ω_i is a local window centered at pixel i and K_i^b is a normalizing factor equal to the sum of the filtering weights.

B. Guided Filter

The guided filter assumes that the filtering output O is a linear transform of the guidance image I in a local window ω_j:

O_i = a_j I_i + b_j,  ∀i ∈ ω_j.    (2)

This model ensures that ∇O ≈ a∇I, which means that the filtering output O will have an edge only if the guidance image I has an edge. To determine the coefficients a_j and b_j, an energy function [see (3)] is constructed as follows:

E(a_j, b_j) = Σ_{i∈ω_j} [(a_j I_i + b_j − P_i)² + ε a_j²]    (3)

where ε is a regularization parameter deciding the degree of blurring for the guided filter. The energy function is based on two goals. First, the filtering output, i.e., (a_j I_i + b_j), should be as close as possible to the input image P. Second, the local linear model should be maintained in the energy function. By minimizing the energy function, abrupt intensity changes in the guidance image I can be mostly preserved in the filtering output O.

C. Comparison of the Two EPFs

Given the aforementioned brief description of the two EPFs, the properties of the EPFs are studied in Fig. 1. Fig. 1(a) and (c) shows the input image P and the guidance image I, respectively. Fig. 1(b) and (d) shows the filtering outputs obtained by different EPFs with different parameter settings. Regarding the joint bilateral filter [see Fig. 1(b), from left to right], the two parameters, i.e., δs and δr, are set as follows: δs = 2, δr = 0.01; δs = 2, δr = 0.1; δs = 4, δr = 0.1;
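To make (1) concrete, the following is a minimal, naive NumPy sketch of the joint bilateral filter. It is illustrative only, not the authors' implementation (the paper reports MATLAB and C++ code); truncating the window at a radius of 2δs is an added assumption.

```python
import numpy as np

def joint_bilateral_filter(P, I, delta_s=2.0, delta_r=0.1, radius=None):
    """Naive joint bilateral filter for a 2-D image, following (1).

    P: input image to be smoothed; I: guidance image (same shape).
    The window radius defaults to 2 * delta_s (a common truncation choice).
    """
    if radius is None:
        radius = int(2 * delta_s)
    H, W = P.shape
    O = np.zeros_like(P, dtype=float)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            # Spatial Gaussian on the image-plane distance ||i - j||.
            g_s = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / delta_s ** 2)
            # Range Gaussian on the guidance-intensity distance |I_i - I_j|.
            g_r = np.exp(-(I[y0:y1, x0:x1] - I[y, x]) ** 2 / delta_r ** 2)
            w = g_s * g_r
            # Division by the weight sum plays the role of K_i^b in (1).
            O[y, x] = np.sum(w * P[y0:y1, x0:x1]) / np.sum(w)
    return O
```

Because the range weight collapses across guidance edges, a pixel is averaged almost exclusively with neighbors on its own side of an edge, which is exactly the joint-filtering behavior exploited later for probability maps.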
Fig. 1. (a) Input image. (b) and (d) Filtered images obtained by different EPFs with different parameter settings. (c) Guidance image.
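The guided filter of (2) and (3) admits the well-known closed-form solution a_j = cov_ω(I, P)/(σ_j² + ε) and b_j = mean_ω(P) − a_j mean_ω(I) [37]. The sketch below is a minimal NumPy illustration of that solution; the edge-replicated box means are an implementation choice of this sketch, not something specified in the text.

```python
import numpy as np

def box_mean(A, r):
    """Mean over a (2r+1) x (2r+1) window, edge-replicated padding, via integral image."""
    Ap = np.pad(A, r, mode='edge')
    c = np.cumsum(np.cumsum(Ap, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))          # zero row/column so window sums are 4 lookups
    H, W = A.shape
    s = c[2 * r + 1:, 2 * r + 1:] - c[:H, 2 * r + 1:] - c[2 * r + 1:, :W] + c[:H, :W]
    return s / (2 * r + 1) ** 2

def guided_filter(P, I, r=3, eps=1e-2):
    """Guided filter per (2)-(3): O = mean(a) * I + mean(b)."""
    mean_I, mean_P = box_mean(I, r), box_mean(P, r)
    var_I = box_mean(I * I, r) - mean_I ** 2         # sigma_k^2 in each window
    cov_IP = box_mean(I * P, r) - mean_I * mean_P
    a = cov_IP / (var_I + eps)                        # minimizer of the energy (3)
    b = mean_P - a * mean_I
    # Average the per-window coefficients over all windows covering each pixel.
    return box_mean(a, r) * I + box_mean(b, r)
```

With a small ε, edges of the guidance image survive filtering (a ≈ 1 near edges), while flat regions are smoothed (a ≈ 0), matching the ∇O ≈ a∇I argument above.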
A. Problem Formulation

The general supervised hyperspectral classification problem can be formulated as follows: Let S ≡ {1, . . . , i} denote the set of pixels of the hyperspectral image, x = (x_1, . . . , x_i) ∈ R^{d×i} denote an image of d-dimensional feature vectors, L ≡ {1, . . . , n} be a set of labels, and c = (c_1, . . . , c_i) be the classification map of labels. Given a training set T_τ ≡ {(x_1, c_1), . . . , (x_τ, c_τ)} ∈ (R^d × L)^τ, where τ is the total number of training samples, the goal of classification is to obtain a classification map, i.e., c, which assigns a label c_i ∈ L to each pixel i ∈ S.

B. Proposed Approach

The proposed approach consists of three steps: 1) construction of the initial probability maps; 2) filtering of the probability maps; and 3) classification based on the maximum probability.

1) Construction of the Initial Probability Maps: It is known that an initial classification map c can be obtained by a pixelwise classifier. In this paper, the pixelwise classification map c is represented using probability maps, i.e., p = (p_1, . . . , p_n), in which p_{i,n} ∈ [0, 1] is the initial probability that a pixel i belongs to the nth class. Specifically, the probability p_{i,n} is defined as follows:

p_{i,n} = 1 if c_i = n, and p_{i,n} = 0 otherwise.    (4)

The SVM classifier is adopted for pixelwise classification since it is one of the most widely used pixelwise classifiers and has been successfully applied in many other spectral–spatial classification methods [6], [18], [48].

2) Filtering of the Probability Maps: Initially, spatial information is not considered, and all probabilities are valued at either 0 or 1. Therefore, the probability maps appear noisy and are not aligned with real object boundaries. To solve this problem, the probability maps are optimized by edge-preserving filtering. Specifically, the optimized probabilities are modeled as a weighted average of their neighborhood probabilities

ṕ_{i,n} = Σ_j W_{i,j}(I) p_{j,n}    (5)

where i and j represent the ith and jth pixels, and the filtering weight W is chosen such that the filter preserves the edges of a specified guidance image I. This step therefore poses two major problems: 1) how to choose an EPF and 2) how to choose a guidance image. Different filters and guidance images produce different filtering weights W_{i,j} for (5).

For the choice of EPF, the two widely used EPFs reviewed in Section II are adopted in this paper. The weights W_{i,j} obtained by the two filters are as follows.

1) Filtering weight for the joint bilateral filter: the weight of the joint bilateral filter is already presented in (1), i.e., W_{i,j}(I) = (1/K_i^b) G_δs(‖i − j‖) G_δr(|I_i − I_j|) for j ∈ ω_i. Based on the corresponding description in Section II, it is easy to see that adjacent input probabilities which have similar intensities or colors in the guidance image will have similar outputs after filtering.

2) Filtering weight for the guided filter: as described in [37], (2) can be represented in the weighted average form of (5), and the filtering weight W_{i,j}(I) of the guided filter can be expressed as follows:

W_{i,j}(I) = (1/|ω|²) Σ_{k:(i,j)∈ω_k} [1 + (I_i − μ_k)(I_j − μ_k)/(σ_k² + ε)]    (6)

where ω_i and ω_j are local windows around pixels i and j, respectively, μ_k and σ_k² are the mean and variance of I in ω_k, and |ω| is the number of pixels in ω_k. A 1-D step edge example is presented in Fig. 3 to demonstrate the edge-preserving property of the filtering weight for the guided filter. As shown in the figure, if I_i and I_j are on the same side of an edge, the term (I_i − μ_k)(I_j − μ_k) in (6) will have a positive sign. However, if I_j is located on the other side of the edge, the term will have a negative sign. Thus, the filtering weight becomes large for pixel pairs on the same side of the edge but small otherwise. Hence, those probabilities on the same side of an edge in the guidance image I tend to have similar filtering outputs.

Fig. 3. Example of 1-D step edge. Here, μ and σ are shown for a filtering kernel centered exactly at an edge.

Regarding the choice of the guidance image I, principal component analysis (PCA) is adopted because it gives an optimal representation of the image in the mean squared sense. Here, two options are given as follows.

1) Gray-scale guidance image: the PCA decomposition is conducted on the original hyperspectral image, and the first principal component, which contains most of the edge information, is adopted as the guidance image (see Fig. 2).

2) Color guidance image: instead of guiding the filtering with a gray-scale image, the first three principal components are used as the color guidance image of the EPFs (see Fig. 2).

Fig. 4 shows an example of probability filtering. It shows that the initial probabilities look noisy and are not aligned with real object boundaries. Probability optimization with edge-preserving filtering has two major advantages in this example [see Fig. 4(b)]. First, noisy probabilities that appear as scattered points or lines can be effectively smoothed. Second, the refined probabilities are always aligned with real object boundaries. The two advantages demonstrate that the spatial contextual information of the guidance image is well utilized in the edge-preserving filtering process.

3) Classification Based on the Maximum Probability: Once the probability maps are filtered, the label at pixel i can be simply chosen in a maximization manner as follows:

ć_i = arg max_n ṕ_{i,n}.    (7)

This step transforms the probability maps ṕ_n into the final classification result ć.

Fig. 4. (a) Initial probability maps for the University of Pavia data set. (b) Final probability maps obtained by edge-preserving filtering. The lower figures in (a) and (b) show close-up views of the probability maps denoted by boxes in the upper figures.
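The three steps above, one-hot probability maps as in (4), joint filtering of each map as in (5), and the per-pixel arg max of (7), can be sketched end to end as follows. This is an illustrative NumPy sketch, not the authors' MATLAB/C++ code; the helper names (`pca_first_component`, `epf_classify`) are invented here, and any joint filter can be passed in as `epf`.

```python
import numpy as np

def pca_first_component(X):
    """First principal component image of an H x W x d hyperspectral cube."""
    H, W, d = X.shape
    flat = X.reshape(-1, d)
    flat = flat - flat.mean(axis=0)
    # Rows of Vt are principal directions of the centered data.
    _, _, Vt = np.linalg.svd(flat, full_matrices=False)
    pc1 = (flat @ Vt[0]).reshape(H, W)
    # Rescale to [0, 1] so it can serve directly as a gray-scale guidance image.
    return (pc1 - pc1.min()) / (pc1.max() - pc1.min() + 1e-12)

def epf_classify(init_labels, guidance, epf, n_classes):
    """Steps (4)-(7): one-hot probability maps -> joint filtering -> arg max.

    `epf(P, I)` is any joint filter that smooths map P under guidance I.
    """
    probs = np.stack([(init_labels == n).astype(float) for n in range(n_classes)])  # (4)
    filtered = np.stack([epf(p, guidance) for p in probs])                          # (5)
    return np.argmax(filtered, axis=0)                                              # (7)
```

In the proposed method, `epf` would be the joint bilateral or guided filter, with the first principal component (or the first three) of the hyperspectral image as the guidance.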
Fig. 7. (a) Three-band color composite of the Salinas image. (b) and
(c) Ground truth data of the Salinas image.
the number of training and test samples for each class is detailed
in Tables II–IV, respectively.
2) Quality Indexes: Three widely used quality indexes, i.e., the overall accuracy (OA), the average accuracy (AA), and the kappa coefficient, are adopted to evaluate the performance of the proposed method. OA is the percentage of correctly classified pixels. AA is the mean of the percentage of correctly classified pixels for each class. The kappa coefficient gives the percentage of correctly classified pixels corrected by the number of agreements that would be expected purely by chance.

Fig. 8. Indian Pines image: analysis of the influence of the parameters δs, δr, r, and ε when the first principal component is selected as the guidance image.
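As a concrete reading of these definitions, all three indexes can be computed from the confusion matrix. The sketch below is illustrative (the function name is invented here), assuming integer labels 0..n_classes−1 and every class present in the ground truth.

```python
import numpy as np

def accuracy_indexes(y_true, y_pred, n_classes):
    """Return (OA, AA, kappa) computed from the confusion matrix."""
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                              # rows: true class, cols: predicted
    total = cm.sum()
    oa = np.trace(cm) / total                      # overall accuracy
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))     # mean per-class accuracy
    # Chance agreement from the true/predicted marginal distributions.
    pe = np.sum(cm.sum(axis=1) * cm.sum(axis=0)) / total ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, aa, kappa
```

Note that kappa discounts the accuracy that a random labeling with the same class proportions would achieve, which is why it is reported alongside OA and AA.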
B. Classification Results

1) Analysis of the Influence of Parameters: For the proposed joint bilateral filtering-based technique, the parameters δs and δr determine the filtering size and blur degree, respectively. Similarly, the parameters r and ε denote the filtering size and blur degree of the guided filter, respectively. The influence of these parameters on the classification performance is analyzed in Figs. 8 and 9 (the experiment is performed on the Indian Pines image). In the experiment, the training set, which accounts for 10% of the ground truth, was chosen randomly (see Table II). The OA, AA, and kappa of the proposed method are measured with different parameter settings. When the influence of δs is analyzed, δr is fixed to be 0.2. Similarly, for the guided filter, when the influence of r is analyzed, ε is fixed to be 0.01. Furthermore, δr and ε can be analyzed in the same way with δs and r fixed at 3.

Fig. 9. Indian Pines image: analysis of the influence of the parameters δs, δr, r, and ε when the color composite of the first three principal components is selected as the guidance image.
When the first principal component is used as the guidance image for the EPFs, it can be seen from Fig. 8 that, if the filtering size and blur degree, i.e., δs, δr, r, and ε, are too large, the average classification accuracy may decrease dramatically. The reason is that a large filtering size and blur degree may over-smooth the probability maps, and thus, those small-scale objects may be misclassified. For example, although the OA obtained with δs = 4 is similar to the accuracy obtained with δs = 5 [see Fig. 8(a)], the AA decreases dramatically when δs = 5 because one small-scale class which contains 20 pixels is totally misclassified when the filtering size is too large. Similarly, a very small filtering size or blur degree is also not good for the proposed method because it means that only very limited local spatial information is considered in the filtering process.

Furthermore, Fig. 9 shows the influence of the parameters when the color composite of the first three principal components serves as the guidance image of the EPFs. From this figure, a similar conclusion can be obtained that the filtering size and blur degree can be neither too small nor too large. In this paper, the default parameter setting of the proposed method is given as follows: When the guidance image of the EPF is a gray-scale image, δs = 3, δr = 0.2, r = 3, and ε = 0.01 are set to be the default parameters; when the guidance image of the EPF is a color image, δs = 4, δr = 0.2, r = 4, and ε = 0.01 are set to be the default parameters. In the following experiments, it is
TABLE I
CLASSIFICATION ACCURACY (IN PERCENT) OF THE PROPOSED METHOD WITH DIFFERENT POSTPROCESSING TECHNIQUES. THE STATISTICS-BASED METHOD IS APPLIED WITHIN A 7 × 7 WINDOW. THE PARAMETERS OF THE WLS FILTER ARE SET TO BE α = 1.4 AND λ = 0.3. THE PARAMETERS OF THE NC FILTER ARE SET TO BE δs = 3 AND δr = 0.2
1) http://www.lx.it.pt/~jun/
2) http://xudongkang.weebly.com
TABLE II
NUMBER OF TRAINING (TRAIN) AND TEST (TEST) SAMPLES OF THE INDIAN PINES IMAGE AND CLASSIFICATION ACCURACIES (IN PERCENT) FOR THE SVM [12], EMP [25], AEAP [49], L-MLL [17], EPF-B-g, EPF-B-c, EPF-G-g, AND EPF-G-c METHODS

TABLE III
NUMBER OF TRAINING (TRAIN) AND TEST (TEST) SAMPLES OF THE UNIVERSITY OF PAVIA IMAGE AND CLASSIFICATION ACCURACIES (IN PERCENT) FOR THE SVM [12], EMP [25], AEAP [49], L-MLL [17], EPF-B-g, EPF-B-c, EPF-G-g, AND EPF-G-c METHODS

TABLE IV
NUMBER OF TRAINING (TRAIN) AND TEST (TEST) SAMPLES OF THE SALINAS IMAGE AND CLASSIFICATION ACCURACIES (IN PERCENT) FOR THE SVM [12], EMP [25], AEAP [49], L-MLL [17], EPF-B-g, EPF-B-c, EPF-G-g, AND EPF-G-c METHODS
TABLE V
COMPUTING TIME (IN SECONDS) OF THE PROPOSED ALGORITHMS. THE NUMBERS OUTSIDE AND INSIDE THE PARENTHESES SHOW THE COMPUTING TIMES OF THE MATLAB AND C++ IMPLEMENTATIONS, RESPECTIVELY

ACKNOWLEDGMENT

The authors would like to thank the Editor-in-Chief, the anonymous Associate Editor, and the reviewers for their insightful comments and suggestions which have greatly improved this paper, P. Ghamisi and N. Falco for their contributions, and M. Pedergnana and Dr. J. Li for providing the software of the AEAP and the L-MLL methods.
REFERENCES

[20] G. Camps-Valls, L. Gomez-Chova, J. Munoz-Mari, J. Vila-Frances, and J. Calpe-Maravilla, "Composite kernels for hyperspectral image classification," IEEE Geosci. Remote Sens. Lett., vol. 3, no. 1, pp. 93–97, Jan. 2006.
[21] M. Fauvel, J. Chanussot, and J. A. Benediktsson, "A spatial–spectral kernel-based approach for the classification of remote-sensing images," Pattern Recognit., vol. 45, no. 1, pp. 381–392, Jan. 2012.
[22] G. Camps-Valls, N. Shervashidze, and K. M. Borgwardt, "Spatio-spectral remote sensing image classification with graph kernels," IEEE Geosci. Remote Sens. Lett., vol. 7, no. 4, pp. 741–745, Oct. 2010.
[23] A. Plaza, P. Martinez, J. Plaza, and R. Perez, "Dimensionality reduction and classification of hyperspectral image data using sequences of extended morphological transformations," IEEE Trans. Geosci. Remote Sens., vol. 43, no. 3, pp. 466–479, Mar. 2005.
[24] D. M. Mura, A. Villa, J. A. Benediktsson, J. Chanussot, and L. Bruzzone, "Classification of hyperspectral images by using extended morphological attribute profiles and independent component analysis," IEEE Geosci. Remote Sens. Lett., vol. 8, no. 3, pp. 542–546, May 2011.
[25] J. A. Benediktsson, J. A. Palmason, and J. R. Sveinsson, "Classification of hyperspectral data from urban areas based on extended morphological profiles," IEEE Trans. Geosci. Remote Sens., vol. 43, no. 3, pp. 480–491, Mar. 2005.
[26] Y. Tarabalka, M. Fauvel, J. Chanussot, and J. A. Benediktsson, "SVM- and MRF-based method for accurate classification of hyperspectral images," IEEE Geosci. Remote Sens. Lett., vol. 7, no. 4, pp. 736–740, Oct. 2010.
[27] G. Moser and S. B. Serpico, "Combining support vector machines and Markov random fields in an integrated framework for contextual image classification," IEEE Trans. Geosci. Remote Sens., vol. 51, no. 5, pp. 2734–2752, May 2013.
[28] Y. Tarabalka, J. A. Benediktsson, and J. Chanussot, "Spectral–spatial classification of hyperspectral imagery based on partitional clustering techniques," IEEE Trans. Geosci. Remote Sens., vol. 47, no. 8, pp. 2973–2987, Aug. 2009.
[29] Y. Tarabalka, J. Chanussot, and J. A. Benediktsson, "Segmentation and classification of hyperspectral images using watershed transformation," Pattern Recognit., vol. 43, no. 7, pp. 2367–2379, Jul. 2010.
[30] Y. Tarabalka, J. A. Benediktsson, J. Chanussot, and J. C. Tilton, "Multiple spectral–spatial classification approach for hyperspectral data," IEEE Trans. Geosci. Remote Sens., vol. 48, no. 11, pp. 4122–4132, Nov. 2010.
[31] Y. Tarabalka, J. Chanussot, and J. A. Benediktsson, "Segmentation and classification of hyperspectral images using minimum spanning forest grown from automatically selected markers," IEEE Trans. Syst., Man, Cybern. B, Cybern., vol. 40, no. 5, pp. 1267–1279, Oct. 2010.
[32] J. Li, J. M. Bioucas-Dias, and A. Plaza, "Spectral–spatial hyperspectral image segmentation using subspace multinomial logistic regression and Markov random fields," IEEE Trans. Geosci. Remote Sens., vol. 50, no. 3, pp. 809–823, Mar. 2012.
[33] C. Tomasi and R. Manduchi, "Bilateral filtering for gray and color images," in Proc. Int. Conf. Comput. Vis., Jan. 1998, pp. 839–846.
[34] K. He, J. Sun, and X. Tang, "Guided image filtering," in Proc. Eur. Conf. Comput. Vision, Heraklion, Greece, Sep. 2010, pp. 1–14.
[35] S. Paris and F. Durand, "A fast approximation of the bilateral filter using a signal processing approach," Int. J. Comput. Vis., vol. 81, no. 1, pp. 24–52, Jan. 2009.
[36] Z. Farbman, R. Fattal, D. Lischinski, and R. Szeliski, "Edge-preserving decompositions for multi-scale tone and detail manipulation," ACM Trans. Graph., vol. 27, no. 3, pp. 67:1–67:10, Aug. 2008.
[37] K. He, J. Sun, and X. Tang, "Guided image filtering," IEEE Trans. Pattern Anal. Mach. Intell., vol. 35, no. 6, pp. 1397–1409, Jun. 2013.
[38] A. Hosni, C. Rhemann, M. Bleyer, C. Rother, and M. Gelautz, "Fast cost-volume filtering for visual correspondence and beyond," IEEE Trans. Pattern Anal. Mach. Intell., vol. 35, no. 2, pp. 504–511, Feb. 2013.
[39] S. Li, X. Kang, and J. Hu, "Image fusion with guided filtering," IEEE Trans. Image Process., vol. 22, no. 7, pp. 2864–2875, Jul. 2013.
[40] S. Li and X. Kang, "Fast multi-exposure image fusion with median filter and recursive filter," IEEE Trans. Consum. Electron., vol. 58, no. 2, pp. 626–632, May 2012.
[41] B. Zhang and J. P. Allebach, "Adaptive bilateral filter for sharpness enhancement and noise removal," IEEE Trans. Image Process., vol. 17, no. 5, pp. 664–678, May 2008.
[42] K. He, J. Sun, and X. Tang, "Single image haze removal using dark channel prior," IEEE Trans. Pattern Anal. Mach. Intell., vol. 33, no. 12, pp. 2341–2353, Dec. 2011.
[43] C. H. Lin, J. S. Tsai, and C. T. Chiu, "Switching bilateral filter with a texture/noise detector for universal noise removal," IEEE Trans. Image Process., vol. 19, no. 9, pp. 2307–2320, Sep. 2010.
[44] K. Kotwal and S. Chaudhuri, "Visualization of hyperspectral images using bilateral filtering," IEEE Trans. Geosci. Remote Sens., vol. 48, no. 5, pp. 2308–2316, May 2010.
[45] E. S. L. Gastal and M. M. Oliveira, "Domain transform for edge-aware image and video processing," ACM Trans. Graph., vol. 30, no. 4, pp. 69:1–69:12, Jul. 2011.
[46] T. Qiu, A. Wang, N. Yu, and A. Song, "LLSURE: Local linear SURE-based edge-preserving image filtering," IEEE Trans. Image Process., vol. 22, no. 1, pp. 80–90, Jan. 2013.
[47] L. Xu, C. Lu, Y. Xu, and J. Jia, "Image smoothing via L0 gradient minimization," ACM Trans. Graph., vol. 30, no. 6, pp. 174:1–174:12, Dec. 2011.
[48] X. Huang and L. Zhang, "An SVM ensemble approach combining spectral, structural, and semantic features for the classification of high-resolution remotely sensed imagery," IEEE Trans. Geosci. Remote Sens., vol. 51, no. 1, pp. 257–272, Jan. 2013.
[49] P. R. Marpu, M. Pedergnana, M. D. Mura, J. A. Benediktsson, and L. Bruzzone, "Automatic generation of standard deviation attribute profiles for spectral–spatial classification of remote sensing data," IEEE Geosci. Remote Sens. Lett., vol. 10, no. 2, pp. 293–297, Mar. 2013.
[50] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 2nd ed. Boston, MA, USA: Addison-Wesley, 2001.
[51] C. C. Chang and C. J. Lin, "LIBSVM: A library for support vector machines," ACM Trans. Intell. Syst. Technol., vol. 2, no. 3, pp. 27:1–27:27, Apr. 2011. [Online]. Available: http://www.csie.ntu.edu.tw/~cjlin/libsvm

Xudong Kang (S'13) received the B.Sc. degree from Northeast University, Shenyang, China, in 2007. He is currently working toward the Ph.D. degree in electrical engineering in Hunan University, Changsha, China. He is currently a Visiting Ph.D. student in electrical engineering at the University of Iceland, Reykjavik, Iceland. He is engaged in image fusion, image superresolution, pansharpening, and hyperspectral image classification.

Shutao Li (M'07) received the B.Sc., M.Sc., and Ph.D. degrees in electrical engineering from Hunan University, Changsha, China, in 1995, 1997, and 2001, respectively. In 2001, he joined the College of Electrical and Information Engineering, Hunan University. From May 2001 to October 2001, he was a Research Associate with the Department of Computer Science, Hong Kong University of Science and Technology, Kowloon, Hong Kong. From November 2002 to November 2003, he was a Postdoctoral Fellow with the Royal Holloway College, University of London, Egham, U.K., working with Prof. J.-S. Taylor. From April 2005 to June 2005, he was a Visiting Professor with the Department of Computer Science, Hong Kong University of Science and Technology. He is currently a Full Professor with the College of Electrical and Information Engineering, Hunan University. He has authored or coauthored more than 160 refereed papers. His professional interests are information fusion, pattern recognition, and image processing.