Item Details

Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions

Sriperumbudur, B., Fukumizu, K., Gretton, A., Lanckriet, G., & Schölkopf, B. (2010). Kernel Choice and Classifiability for RKHS Embeddings of Probability Distributions. Advances in Neural Information Processing Systems 22: 23rd Annual Conference on Neural Information Processing Systems 2009, 1750-1758.


Basic Information

Item Type: Conference Paper


Creators

Creators:
Sriperumbudur, B. K.1, 2, Author
Fukumizu, K.1, Author
Gretton, A.1, Author
Lanckriet, G. R. G., Author
Schölkopf, B.1, Author
Bengio, Y., Editor
Schuurmans, D., Editor
Lafferty, J., Editor
Williams, C., Editor
Culotta, A., Editor
Affiliations:
1 Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2 Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society, ou_1497647

Content

Keywords: -
Abstract: Embeddings of probability measures into reproducing kernel Hilbert spaces have been proposed as a straightforward and practical means of representing and comparing probabilities. In particular, the distance between embeddings (the maximum mean discrepancy, or MMD) has several key advantages over many classical metrics on distributions, namely easy computability, fast convergence and low bias of finite sample estimates. An important requirement of the embedding RKHS is that it be characteristic: in this case, the MMD between two distributions is zero if and only if the distributions coincide. Three new results on the MMD are introduced in the present study. First, it is established that MMD corresponds to the optimal risk of a kernel classifier, thus forming a natural link between the distance between distributions and their ease of classification. An important consequence is that a kernel must be characteristic to guarantee classifiability between distributions in the RKHS. Second, the class of characteristic kernels is broadened to incorporate all strictly positive definite kernels: these include non-translation invariant kernels and kernels on non-compact domains. Third, a generalization of the MMD is proposed for families of kernels, as the supremum over MMDs on a class of kernels (for instance the Gaussian kernels with different bandwidths). This extension is necessary to obtain a single distance measure if a large selection or class of characteristic kernels is potentially appropriate. This generalization is reasonable, given that it corresponds to the problem of learning the kernel by minimizing the risk of the corresponding kernel classifier. The generalized MMD is shown to have consistent finite sample estimates, and its performance is demonstrated on a homogeneity testing example.
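As a quick orientation to the quantities in the abstract: the MMD between P and Q is the RKHS distance between their mean embeddings, MMD(P, Q) = ||mu_P - mu_Q||_H, and the generalized MMD takes the supremum of this distance over a kernel family. Below is a minimal numpy sketch of the biased empirical squared-MMD estimate and of the generalized MMD over a grid of Gaussian bandwidths; it is an illustration under assumed conventions, not the authors' implementation, and the function names and the bandwidth grid are invented for the example.

import numpy as np

def gaussian_kernel(X, Y, bandwidth):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * bandwidth**2))

def mmd2_biased(X, Y, bandwidth):
    # Biased empirical estimate of squared MMD for samples X ~ P, Y ~ Q:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    return (gaussian_kernel(X, X, bandwidth).mean()
            + gaussian_kernel(Y, Y, bandwidth).mean()
            - 2.0 * gaussian_kernel(X, Y, bandwidth).mean())

def generalized_mmd(X, Y, bandwidths):
    # Generalized MMD: supremum of the MMD over a family of Gaussian kernels,
    # approximated here by a finite grid of bandwidths (an assumption of this sketch).
    return max(np.sqrt(max(mmd2_biased(X, Y, s), 0.0)) for s in bandwidths)

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 1))  # sample from P
Y = rng.normal(0.5, 1.0, size=(200, 1))  # sample from Q (shifted mean)
print(generalized_mmd(X, Y, bandwidths=[0.1, 0.5, 1.0, 2.0, 5.0]))

A larger generalized MMD across the family gives stronger evidence that P and Q differ, which is the homogeneity-testing use mentioned in the abstract.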

Details

Language: -
Date: 2010-04
Publication Status: Published
Pages: -
Publishing Info: -
Table of Contents: -
Review Method: -
Identifiers (DOI, ISBN, etc.): ISBN: 978-1-615-67911-9
URI: http://nips.cc/Conferences/2009/
BibTex Reference ID: 6131
Degree: -

Related Event

Event Title: 23rd Annual Conference on Neural Information Processing Systems (NIPS 2009)
Place of Event: Vancouver, BC, Canada
Start / End Date: -


Source 1

Title: Advances in Neural Information Processing Systems 22: 23rd Annual Conference on Neural Information Processing Systems 2009
Type: Journal
Authors / Editors:
Affiliations:
Publisher, Place: Red Hook, NY, USA : Curran
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1750 - 1758
Identifiers (ISBN, ISSN, DOI, etc.): -