

Released

Conference Paper

Risk-Based Generalizations of f-divergences

MPS-Authors

von Luxburg,  U.
Research Group Machine Learning Theory, Max Planck Institute for Intelligent Systems, Max Planck Society;

Citation

García-García, D., von Luxburg, U., & Santos-Rodríguez, R. (2011). Risk-Based Generalizations of f-divergences. In 28th International Conference on Machine Learning (ICML 2011) (pp. 417-424).


Cite as: https://hdl.handle.net/11858/00-001M-0000-0010-4CAB-9
Abstract
We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence that we use for clustering sets of vectors.
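
For reference, the classical f-divergence that the (f,l)-family generalizes is defined, for a convex function f with f(1) = 0 and distributions P, Q (with P absolutely continuous with respect to Q), as

D_f(P \,\|\, Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ,

with the Kullback-Leibler divergence recovered for f(t) = t \log t. This is standard background only; the exact surrogate-risk definition of the (f,l)-divergence is given in the paper itself.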