Abstract:
We introduce a new contrast function, the kernel mutual information
(KMI), to measure the degree of independence of continuous random
variables. This contrast function provides an approximate upper bound
on the mutual information, as measured near independence, and is based
on a kernel density estimate of the mutual information between discretised
approximations of the continuous random variables. We show that Bach
and Jordan's kernel generalised variance (KGV) is also an upper bound
on the same kernel density estimate, but a looser one. Finally, we suggest
that the addition of a regularising term in the KGV causes it to approach
the KMI, which motivates the introduction of this regularisation.