DOI: 10.1145/3292500.3330975 · KDD Conference Proceedings · Research article

A Free Energy Based Approach for Distance Metric Learning

Published: 25 July 2019

Abstract

We present a reformulation of the distance metric learning problem as a penalized optimization problem, with a penalty term corresponding to the von Neumann entropy of the distance metric. This formulation leads to a mapping to statistical mechanics in which the metric learning optimization problem becomes equivalent to free energy minimization. Correspondingly, our approach yields an analytical solution of the optimization problem based on the Boltzmann distribution. The mapping established in this work suggests new approaches for dimensionality reduction and provides insights into the determination of optimal parameters for the penalty term. Furthermore, we demonstrate that the learned metric projects the data onto the direction of maximum dissimilarity, with optimal and tunable separation between classes; the transformation can therefore be used for high-dimensional data visualization, classification, and clustering tasks. We benchmark our method against previous distance metric learning methods and provide an efficient implementation in an R package, available at https://github.com/kouroshz/fenn
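The equivalence between entropy-penalized optimization and free energy minimization that the abstract invokes rests on the Gibbs variational principle: among all density matrices M (symmetric, positive semi-definite, unit trace), the Boltzmann state exp(-H/T)/Z minimizes the free energy Tr(HM) - T*S(M), where S is the von Neumann entropy. A minimal numerical sketch of that principle follows; the matrix H, the temperature T, and all function names are illustrative assumptions, not the paper's notation or the fenn package's API:

```python
import numpy as np

def gibbs_state(H, T):
    """Analytic minimizer of F(M) = Tr(H M) - T * S(M) over density
    matrices: the Boltzmann form M* = exp(-H/T) / Tr(exp(-H/T))."""
    w, V = np.linalg.eigh(H)
    e = np.exp(-(w - w.min()) / T)   # shift by min eigenvalue for stability
    p = e / e.sum()                  # Boltzmann weights, summing to 1
    return (V * p) @ V.T             # V diag(p) V^T

def free_energy(H, M, T):
    """F(M) = Tr(H M) - T * S(M), with the von Neumann entropy
    S(M) = -Tr(M log M) computed from the eigenvalues of M."""
    lam = np.clip(np.linalg.eigvalsh(M), 1e-12, None)
    S = -np.sum(lam * np.log(lam))
    return np.trace(H @ M) - T * S

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = A + A.T                          # toy symmetric "energy" matrix
T = 0.5

M_star = gibbs_state(H, T)
F_star = free_energy(H, M_star, T)

# Any other density matrix should have free energy >= F_star.
B = rng.standard_normal((4, 4))
P = B @ B.T
M_other = P / np.trace(P)
assert F_star <= free_energy(H, M_other, T) + 1e-9
```

The closed form falls out of the eigendecomposition: the penalty only reshuffles eigenvalue weights into a Boltzmann distribution, which is presumably what makes the paper's analytical solution tractable.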


Cited By

  • UNIT: A unified metric learning framework based on maximum entropy regularization. Applied Intelligence 53, 20 (2023), 24509--24529. DOI: 10.1007/s10489-023-04831-x
  • A New Similarity Space Tailored for Supervised Deep Metric Learning. ACM Transactions on Intelligent Systems and Technology 14, 1 (2022), 1--25. DOI: 10.1145/3559766
  • On the Robustness of Metric Learning: An Adversarial Perspective. ACM Transactions on Knowledge Discovery from Data 16, 5 (2022), 1--25. DOI: 10.1145/3502726
  • Metric Learning via Penalized Optimization. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining (2021), 656--664. DOI: 10.1145/3447548.3467369


Published In
KDD '19: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining
July 2019
3305 pages
ISBN:9781450362016
DOI:10.1145/3292500
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. dimensionality reduction
  2. distance metric learning
  3. high-dimensional data visualization


Acceptance Rates

KDD '19 paper acceptance rate: 110 of 1,200 submissions (9%)
Overall acceptance rate: 1,133 of 8,635 submissions (13%)


Article Metrics

  • Downloads (last 12 months): 37
  • Downloads (last 6 weeks): 7
Reflects downloads up to 26 Jan 2025
