Abstract
The Support Vector Machine (SVM) is a powerful classification methodology in which the support vectors (SVs) fully describe the decision surface by incorporating local information. Nonparametric Discriminant Analysis (NDA), on the other hand, is an improvement over the more general Linear Discriminant Analysis (LDA) in which the normality assumption of LDA is relaxed; NDA is also based on detecting the dominant normal directions to the decision surface. This paper introduces a novel SVM + NDA model that combines these two methods. It can be viewed as an extension of the SVM that incorporates some partially global information about the data, in particular discriminatory information in the normal direction to the decision boundary. It can also be considered an extension of NDA in which the support vectors improve the choice of κ-nearest neighbors (κ-NNs) on the decision boundary by incorporating local information. Since our model extends both SVM and NDA, it can deal with heteroscedastic and non-normal data, and it also avoids the small sample size problem. Moreover, the model can be reduced to the classical SVM model, so existing SVM programs can be used for easy implementation. An extensive comparison of SVM + NDA to LDA, SVM, NDA, and the combined SVM and LDA, performed on artificial and real data sets, has shown the advantages and superiority of the proposed model. In particular, the experiments on face recognition have clearly shown a significant improvement of SVM + NDA over the other methods, especially SVM and NDA.
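The abstract only describes the model at a high level; the exact SVM + NDA formulation appears in the paper body, not here. As a rough, assumption-laden illustration of the general idea it hints at (nonparametric κ-NN scatter information combined with an off-the-shelf SVM solver), the Python sketch below builds Fukunaga-style nonparametric scatter matrices, projects the data onto their dominant discriminant directions, and then trains a standard linear SVM on the projection. The helper nda_transform, the regularization term, and all parameter choices are illustrative assumptions and not the authors' algorithm.

```python
# Minimal sketch: nonparametric (k-NN based) discriminant projection followed
# by a standard SVM. Purely illustrative; not the SVM + NDA model of the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def nda_transform(X, y, k=5, n_dims=2, reg=1e-3):
    """Return a projection matrix built from the leading eigenvectors of
    Sw^{-1} Sb, where Sb is a nonparametric between-class scatter based on
    k-NN means of the opposite class (two-class case, in the spirit of
    Fukunaga's NDA; the usual weighting function is omitted for brevity)."""
    classes = np.unique(y)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        Xo = X[y != c]                       # samples from the other class
        # within-class scatter of class c
        diff_w = Xc - Xc.mean(axis=0)
        Sw += diff_w.T @ diff_w
        # nonparametric between-class scatter: deviation of each sample from
        # the mean of its k nearest neighbours in the opposite class
        nn = NearestNeighbors(n_neighbors=k).fit(Xo)
        _, idx = nn.kneighbors(Xc)
        local_means = Xo[idx].mean(axis=1)   # shape (n_c, d)
        diff_b = Xc - local_means
        Sb += diff_b.T @ diff_b
    # regularized generalized eigenproblem Sb v = lambda Sw v
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs[:, order[:n_dims]].real     # projection matrix (d, n_dims)


if __name__ == "__main__":
    X, y = make_classification(n_samples=400, n_features=10, n_informative=4,
                               random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    W = nda_transform(Xtr, ytr, k=5, n_dims=4)
    clf = SVC(kernel="linear", C=1.0).fit(Xtr @ W, ytr)
    print("accuracy:", accuracy_score(yte, clf.predict(Xte @ W)))
```

Because the projection is just a linear preprocessing step, any existing SVM implementation can be reused unchanged afterwards, which mirrors the abstract's remark that the model can be reduced to a classical SVM problem.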
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Khan, N.M., Ksantini, R., Ahmad, I.S., Boufama, B. (2010). A New SVM + NDA Model for Improved Classification and Recognition. In: Campilho, A., Kamel, M. (eds) Image Analysis and Recognition. ICIAR 2010. Lecture Notes in Computer Science, vol 6111. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13772-3_14
DOI: https://doi.org/10.1007/978-3-642-13772-3_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-13771-6
Online ISBN: 978-3-642-13772-3
eBook Packages: Computer Science, Computer Science (R0)