Abstract
Many linear discriminant analysis (LDA) and kernel Fisher discriminant analysis (KFD) methods are based on the restrictive assumption that the data are homoscedastic. In this paper, we propose a new KFD method called heteroscedastic kernel weighted discriminant analysis (HKWDA) which has several appealing characteristics. First, like all kernel methods, it can handle nonlinearity efficiently in a disciplined manner. Second, by incorporating into the discriminant criterion a weighting function that can capture heteroscedastic data distributions, it can work under more realistic situations and hence further enhance classification accuracy in many real-world applications. Moreover, it can effectively deal with the small sample size problem. Face recognition experiments comparing HKWDA with several linear and nonlinear dimensionality reduction methods show that HKWDA consistently gives the best results.
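The core idea in the abstract — a kernel Fisher discriminant whose between-class scatter weights each class pair according to how hard that pair is to separate — can be sketched in the dual (kernel) formulation as follows. This is an illustrative reconstruction, not the authors' exact HKWDA: the erf-based weight is a stand-in borrowed from the weighted pairwise Fisher literature rather than the paper's Chernoff-based criterion, and the names `rbf_kernel`, `weighted_kfd`, `project` and the parameters `gamma`, `reg` are choices made for this sketch.

```python
import math
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def weighted_kfd(X, y, gamma=1.0, n_components=1, reg=1e-3):
    """Kernel Fisher discriminant with a weighted pairwise between-class scatter.

    The weight w(d) = erf(d / (2*sqrt(2))) / (2*d^2) is a decreasing function of
    the feature-space distance d between two class means, so class pairs that are
    already well separated contribute less (a hypothetical stand-in for the
    paper's Chernoff-based weighting). The ridge term `reg` keeps the
    within-class scatter invertible, one common remedy for the small sample
    size problem.
    """
    classes = np.unique(y)
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # Dual representation of each class mean: m_c[i] = mean_{j in class c} k(x_i, x_j).
    means = {c: K[:, y == c].mean(axis=1) for c in classes}
    M = np.zeros((n, n))  # weighted between-class scatter (dual form)
    for a in range(len(classes)):
        for b in range(a + 1, len(classes)):
            ia, ib = y == classes[a], y == classes[b]
            # Exact squared feature-space distance between the two class means.
            d2 = (K[np.ix_(ia, ia)].mean() - 2.0 * K[np.ix_(ia, ib)].mean()
                  + K[np.ix_(ib, ib)].mean())
            d = math.sqrt(max(d2, 1e-12))
            w = math.erf(d / (2.0 * math.sqrt(2.0))) / (2.0 * d**2)
            diff = means[classes[a]] - means[classes[b]]
            M += w * np.outer(diff, diff)
    N = np.zeros((n, n))  # within-class scatter (dual form)
    for c in classes:
        Kc = K[:, y == c]
        nc = Kc.shape[1]
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    N += reg * np.eye(n)  # regularisation for the small sample size case
    # Leading eigenvectors of N^{-1} M give the dual projection coefficients.
    evals, evecs = np.linalg.eig(np.linalg.solve(N, M))
    order = np.argsort(-evals.real)
    return evecs[:, order[:n_components]].real

def project(alpha, X_train, X_new, gamma=1.0):
    """Project new points onto the learned discriminant directions."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Because everything is expressed through the kernel matrix, the same code handles nonlinear decision boundaries simply by the choice of kernel, which is the "disciplined" nonlinearity the abstract refers to.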
Keywords
- Face Recognition
- Kernel Principal Component Analysis
- Discriminatory Information
- Discriminant Criterion
- Small Sample Size Problem
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Dai, G., Yeung, D.Y., Chang, H. (2006). Extending Kernel Fisher Discriminant Analysis with the Weighted Pairwise Chernoff Criterion. In: Leonardis, A., Bischof, H., Pinz, A. (eds) Computer Vision – ECCV 2006. ECCV 2006. Lecture Notes in Computer Science, vol 3954. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11744085_24
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-33838-3
Online ISBN: 978-3-540-33839-0
eBook Packages: Computer Science (R0)