Abstract
Metric learning is the task of learning a distance metric from training data that captures the relevant relationships among the data. An appropriate distance metric is of considerable importance for building accurate classifiers. In this paper, we propose a novel supervised metric learning method, nearest hit-misses component analysis. In our method, the margin is first defined with respect to the nearest hits (nearest neighbors from the same class) and the nearest misses (nearest neighbors from a different class), and the distance metric is then trained by maximizing this margin while minimizing the distance between each sample and its nearest hits. We further introduce a regularization term to alleviate overfitting. Moreover, the proposed method can perform metric learning and dimensionality reduction simultaneously. Comparative experiments with state-of-the-art metric learning methods on various real-world data sets demonstrate the effectiveness of the proposed method.
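The abstract's objective can be illustrated with a minimal sketch. This is not the authors' algorithm (the paper's exact objective and optimizer are not reproduced here); it is a hypothetical hit/miss margin objective under stated assumptions: nearest hits and misses are found once in the input space, a linear transform `L` is learned by projected gradient descent that pulls each sample toward its nearest hit and pushes it from its nearest miss, and a unit-Frobenius-norm constraint stands in for the paper's regularizer to keep the objective bounded. All function names (`nearest_hit_miss`, `learn_transform`) are invented for illustration.

```python
import numpy as np

def nearest_hit_miss(X, y):
    """Index of each sample's nearest hit (same class) and nearest
    miss (different class), by Euclidean distance in the input space."""
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    np.fill_diagonal(D, np.inf)           # a point is not its own neighbour
    hits = np.empty(n, dtype=int)
    misses = np.empty(n, dtype=int)
    for i in range(n):
        same = y == y[i]
        same[i] = False
        diff = y != y[i]
        hits[i] = np.where(same)[0][np.argmin(D[i, same])]
        misses[i] = np.where(diff)[0][np.argmin(D[i, diff])]
    return hits, misses

def learn_transform(X, y, r=2, lr=0.1, iters=200, seed=0):
    """Projected gradient descent on a margin-style objective:
    minimise sum_i ||L(x_i - hit_i)||^2 - ||L(x_i - miss_i)||^2
    over r-by-d matrices L of unit Frobenius norm (a stand-in for
    the paper's regulariser)."""
    rng = np.random.default_rng(seed)
    hits, misses = nearest_hit_miss(X, y)
    Vh = X - X[hits]                      # "pull" difference vectors (hits)
    Vm = X - X[misses]                    # "push" difference vectors (misses)
    C = Vh.T @ Vh - Vm.T @ Vm             # fixed d x d pull-minus-push matrix
    C /= np.linalg.norm(C)                # scale for a stable step size
    L = rng.normal(scale=0.1, size=(r, X.shape[1]))
    L /= np.linalg.norm(L)
    for _ in range(iters):
        L -= lr * 2 * L @ C               # gradient of trace(L C L^T)
        L /= np.linalg.norm(L)            # project back to unit norm
    return L
```

Because `L` maps inputs to an r-dimensional space with r < d, the same sketch also illustrates the abstract's point that metric learning and dimensionality reduction can be performed simultaneously: the learned Mahalanobis-style metric is `M = L.T @ L`.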
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Yang, W., Wang, K., Zuo, W. (2010). Nearest Hit-Misses Component Analysis for Supervised Metric Learning. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Theory and Algorithms. ICONIP 2010. Lecture Notes in Computer Science, vol 6443. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17537-4_47
DOI: https://doi.org/10.1007/978-3-642-17537-4_47
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-17536-7
Online ISBN: 978-3-642-17537-4
eBook Packages: Computer Science (R0)