Report (Released)
The Kernel Mutual Information

MPS-Authors

Gretton,  A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public): grehersmo03.pdf (920 KB)

Citation

Gretton, A., Herbrich, R., & Smola, A. (2003). The Kernel Mutual Information. Tübingen, Germany: Max Planck Institute for Biological Cybernetics.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DCBB-2
Abstract
We introduce two new functions, the kernel covariance (KC) and the kernel
mutual information (KMI), to measure the degree of independence of several
continuous random variables.
The former is guaranteed to be zero if and only if the random variables
are pairwise independent; the latter shares this property, is based on a
kernel density estimate, and in addition approximately upper-bounds the
mutual information near independence.
We show that Bach and Jordan’s kernel generalised variance (KGV) is also
an upper bound on this kernel density estimate of the mutual information, but a looser one.
Finally, we suggest that the addition of a regularising term in the KGV
causes it to approach the KMI, which motivates the introduction of this
regularisation.
The performance of the KC and KMI is verified in the context of instantaneous
independent component analysis (ICA), by recovering both artificial and
real (musical) signals following linear mixing.
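To make the idea of a Gram-matrix independence statistic concrete, the sketch below computes an HSIC-style kernel dependence statistic from centred Gram matrices — a close relative of the kernel covariance described in the abstract, not the paper's exact KC or KMI. Gaussian RBF kernels with a fixed bandwidth and all function names are assumptions for illustration.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gram matrix of a Gaussian RBF kernel for 1-D samples x
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_dependence_stat(x, y, sigma=1.0):
    # HSIC-style statistic: trace of the product of centred Gram
    # matrices, normalised by n^2. Near zero when x and y are
    # independent, larger under dependence.
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n  # centring matrix
    Kx = H @ rbf_gram(x, sigma) @ H
    Ky = H @ rbf_gram(y, sigma) @ H
    return np.trace(Kx @ Ky) / n ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y_dep = x + 0.1 * rng.standard_normal(300)   # strongly dependent on x
y_ind = rng.standard_normal(300)             # independent of x

stat_dep = kernel_dependence_stat(x, y_dep)
stat_ind = kernel_dependence_stat(x, y_ind)
```

In an ICA setting like the one tested in the report, such a statistic would be minimised over unmixing matrices; here it merely illustrates that the kernel statistic separates dependent from independent pairs.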