Conference Paper

A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery

MPS-Authors
Blaschko, MB
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society

Gretton, A
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Blaschko, M., & Gretton, A. (2008). A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery. In 6th International Workshop on Mining and Learning with Graphs (MLG 2008) (pp. 1-3).


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C831-4
Abstract
In recent work by Song et al. (2007), it has been proposed to perform clustering by maximizing a Hilbert-Schmidt independence criterion with respect to a predefined cluster structure Y, by solving for the partition matrix Π. We extend this approach here to the case where the cluster structure Y is not fixed, but is a quantity to be optimized; and we use an independence criterion which has been shown to be more sensitive at small sample sizes (the Hilbert-Schmidt Normalized Information Criterion, or HSNIC; Fukumizu et al., 2008). We demonstrate the use of this framework in two scenarios. In the first, we adopt a cluster structure selection approach in which the HSNIC is used to select a structure from several candidates. In the second, we consider the case where we discover structure by directly optimizing Y.
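
As a reading aid, the following is a minimal sketch of the dependence-maximization idea summarized above: a kernel matrix on the data is compared against a label-side kernel Π Y Πᵀ built from a partition matrix Π and a candidate cluster structure Y, and the candidate structure with the highest dependence score is selected. It uses the plain (biased) HSIC estimator rather than the HSNIC of Fukumizu et al. (2008), and the kernel choice, candidate structures, and random labelling are illustrative assumptions, not the authors' implementation.

import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel matrix on the rows of X (an assumed kernel choice).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC: tr(K H L H) / (n - 1)^2 with H = I - (1/n) 11^T.
    # Stands in here for the normalized criterion (HSNIC) used in the paper.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def label_kernel(labels, Y):
    # Label-side kernel Pi Y Pi^T: Pi is the one-hot partition matrix,
    # Y encodes the cluster structure (identity for a flat clustering,
    # an adjacency-like matrix for chains, rings, etc.).
    Pi = np.eye(Y.shape[0])[labels]          # n x k partition matrix
    return Pi @ Y @ Pi.T

# Structure selection: score several candidate structures Y for a fixed
# (here random, purely illustrative) labelling and keep the best one.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2))
labels = rng.integers(0, 3, size=60)
K = rbf_kernel(X)

candidates = {
    "independent clusters": np.eye(3),
    "chain": np.array([[1.0, 0.5, 0.0],
                       [0.5, 1.0, 0.5],
                       [0.0, 0.5, 1.0]]),
}
scores = {name: hsic(K, label_kernel(labels, Y)) for name, Y in candidates.items()}
best = max(scores, key=scores.get)

In the same spirit, the paper's second scenario would optimize the entries of Y directly (subject to suitable constraints) rather than choosing among fixed candidates, with the scoring function unchanged.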