
Released

Conference Paper

Risk-Based Generalizations of f-divergences

MPS-Authors

von Luxburg, U.
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;

External Resource

http://www.icml-2011.org/
(Table of contents)

Fulltext (public)
There are no public fulltexts stored in PuRe
Supplementary Material (public)
There is no public supplementary material available
Citation

García-García, D., von Luxburg, U., & Santos-Rodríguez, R. (2011). Risk-Based Generalizations of f-divergences. In L. Getoor (Ed.), 28th International Conference on Machine Learning (ICML 2011) (pp. 417-424). Madison, WI, USA: International Machine Learning Society.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-BB2E-6
Abstract
We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
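For readers unfamiliar with sample-based KL divergence estimation, the following sketch shows a standard nearest-neighbour estimator of D(P||Q) from two samples (the classical Wang-Kulkarni-Verdú construction, given here only as generic background; it is not the risk-based estimator derived in the paper, and the function name is illustrative):

```python
import numpy as np

def knn_kl_estimate(x, y):
    """1-nearest-neighbour estimate of KL divergence D(P||Q).

    x : (n, d) array of samples from P
    y : (m, d) array of samples from Q

    Generic Wang-Kulkarni-Verdu-style estimator, shown for background;
    not the (f,l)-divergence-based estimator from the paper.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # rho_i: distance from x_i to its nearest neighbour in x (excluding itself)
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)
    rho = dxx.min(axis=1)

    # nu_i: distance from x_i to its nearest neighbour in y
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = dxy.min(axis=1)

    # D_hat = (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1))
    return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

Given samples from N(0,1) and N(3,1), the estimate approaches the true value KL = 4.5 as the sample sizes grow; for two independent samples from the same distribution it approaches 0. Plugging such an estimator into a pairwise-divergence matrix is one common way to cluster sets of vectors, which is the application setting the abstract mentions.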