



Conference Paper

Information-theoretic Metric Learning


Kulis, B., Jain, P., & Sra, S.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society


Cite as:
Davis, J., Kulis, B., Jain, P., Sra, S., & Dhillon, I. (2007). Information-theoretic Metric Learning. Proceedings of the 24th Annual International Conference on Machine Learning (ICML 2007), 209-216.

Abstract:
In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the distance function. We express this problem as a particular Bregman optimization problem---that of minimizing the LogDet divergence subject to linear constraints. Our resulting algorithm has several advantages over existing methods. First, our method can handle a wide variety of constraints and can optionally incorporate a prior on the distance function. Second, it is fast and scalable. Unlike most existing methods, no eigenvalue computations or semi-definite programming are required. We also present an online version and derive regret bounds for the resulting algorithm. Finally, we evaluate our method on a recent error reporting system for software called Clarify, in the context of metric learning for nearest neighbor classification, as well as on standard data sets.
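The optimization the abstract describes, minimizing the LogDet divergence to a prior matrix A0 subject to linear constraints on pairwise distances, can be carried out by cycling through the constraints and applying a rank-one Bregman projection for each one. The sketch below illustrates that style of update; the function name itml_fit, its argument names, and the convergence test are illustrative assumptions rather than the authors' released code, and the slack handling follows the gamma-parameterized variant of the algorithm as we understand it.

```python
import numpy as np

def itml_fit(X, pairs, similar, u, l, A0=None, gamma=1.0, max_iters=1000, tol=1e-3):
    """Sketch of an ITML-style solver: cyclic Bregman projections that keep the
    Mahalanobis matrix A close (in LogDet divergence) to the prior A0 while
    pushing similar pairs below distance u and dissimilar pairs above l.
    Names and defaults here are illustrative, not the authors' implementation."""
    d = X.shape[1]
    A = np.eye(d) if A0 is None else A0.copy()
    lam = np.zeros(len(pairs))                      # one dual variable per constraint
    xi = np.array([u if s else l for s in similar], dtype=float)  # slack-adjusted targets

    for _ in range(max_iters):
        max_step = 0.0
        for c, ((i, j), s) in enumerate(zip(pairs, similar)):
            v = X[i] - X[j]
            p = float(v @ A @ v)                    # current squared distance under A
            if p < 1e-12:
                continue
            delta = 1.0 if s else -1.0
            # Projection step size, clipped so the dual variable stays non-negative.
            alpha = min(lam[c], 0.5 * delta * (1.0 / p - gamma / xi[c]))
            beta = delta * alpha / (1.0 - delta * alpha * p)
            xi[c] = gamma * xi[c] / (gamma + delta * alpha * xi[c])
            lam[c] -= alpha
            Av = A @ v
            A += beta * np.outer(Av, Av)            # rank-one update of the metric
            max_step = max(max_step, abs(alpha))
        if max_step < tol:                          # stop when projections barely move A
            break
    return A

# Toy usage: index pairs into X with a flag per pair (True = similar, distance <= u).
X = np.random.RandomState(0).randn(20, 3)
A = itml_fit(X, pairs=[(0, 1), (2, 3)], similar=[True, False], u=1.0, l=10.0)
```

Because each projection is a rank-one modification of A, a solver in this style needs no eigenvalue decompositions or semidefinite programming, which is the scalability point the abstract makes.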