Abstract
Recent research shows that linear regression bears strong connections to many subspace learning methods, such as linear discriminant analysis and locality preserving projection. When linear regression methods are applied for dimensionality reduction, a major disadvantage is that they fail to consider the geometric structure of the data. In this paper, we propose a graph regularized ridge regression for dimensionality reduction. We develop a new algorithm for affinity graph construction based on nonnegative least squares and use the affinity graph to capture neighborhood geometric structure. The global and neighborhood structure information is modeled as a graph regularized least squares problem. We design an efficient model selection scheme for optimal parameter estimation, which balances the tradeoff between the global and neighborhood structures. Extensive experimental studies on benchmark data sets show the effectiveness of our approach.
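The two ingredients named in the abstract (a nonnegative-least-squares affinity graph and a graph regularized ridge regression) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the specific objective, the symmetrization of the affinity matrix, and the parameter names `alpha` and `beta` are assumptions, and the closed-form solve uses the standard Laplacian-regularized ridge formulation.

```python
import numpy as np
from scipy.optimize import nnls

def nnls_affinity_graph(X):
    """Assumed construction: reconstruct each sample from the remaining
    samples under a nonnegativity constraint, then symmetrize the weights."""
    n = X.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # min_w ||X[others].T @ w - X[i]||^2  subject to  w >= 0
        w, _ = nnls(X[others].T, X[i])
        S[i, others] = w
    return (S + S.T) / 2.0

def graph_regularized_ridge(X, Y, S, alpha=1.0, beta=1.0):
    """Closed-form minimizer of the (assumed) objective
        ||X W - Y||_F^2 + alpha ||W||_F^2 + beta tr(W^T X^T L X W),
    where L = D - S is the graph Laplacian of the affinity matrix S.
    beta trades off global fit against neighborhood structure."""
    L = np.diag(S.sum(axis=1)) - S
    d = X.shape[1]
    A = X.T @ X + alpha * np.eye(d) + beta * X.T @ L @ X
    return np.linalg.solve(A, X.T @ Y)
```

With `beta=0` the second function reduces to ordinary ridge regression, which is one way to see the claimed connection between linear regression and subspace learning.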
© 2012 Springer-Verlag Berlin Heidelberg
Shu, X., Lu, H. (2012). Neighborhood Structure Preserving Ridge Regression for Dimensionality Reduction. In: Liu, CL., Zhang, C., Wang, L. (eds) Pattern Recognition. CCPR 2012. Communications in Computer and Information Science, vol 321. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33506-8_4
Print ISBN: 978-3-642-33505-1
Online ISBN: 978-3-642-33506-8