Item Details

  Causal Markov condition for submodular information measures

Steudel, B., Janzing, D., & Schölkopf, B. (2010). Causal Markov condition for submodular information measures. In 23rd Annual Conference on Learning Theory (COLT 2010) (pp. 464-476). Madison, WI, USA: OmniPress.


Basic Information

Item type: Conference Paper

Files


Related URLs


Creators

Creators:
Steudel, B.1, 2, Author
Janzing, D.3, Author
Schölkopf, B.1, Author
Kalai, A. T., Editor
Mohri, M., Editor
Affiliations:
1Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795
2Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society, ou_1497647
3Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497795

Content

Keywords: -
Abstract: The causal Markov condition (CMC) is a postulate that links observations to causality. It describes the conditional independences among the observations that are entailed by a causal hypothesis in terms of a directed acyclic graph. In the conventional setting, the observations are random variables and the independence is a statistical one, i.e., the information content of observations is measured in terms of Shannon entropy. We formulate a generalized CMC for any kind of observations on which independence is defined via an arbitrary submodular information measure. Recently, this has been discussed for observations in terms of binary strings where information is understood in the sense of Kolmogorov complexity. Our approach enables us to find computable alternatives to Kolmogorov complexity, e.g., the length of a text after applying existing data compression schemes. We show that our CMC is justified if one restricts the attention to a class of causal mechanisms that is adapted to the respective information measure. Our justification is similar to deriving the statistical CMC from functional models of causality, where every variable is a deterministic function of its observed causes and an unobserved noise term. Our experiments on real data demonstrate the performance of compression-based causal inference.
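
The abstract's point about replacing Kolmogorov complexity with the length of compressed text suggests a concrete, computable information measure. The Python sketch below is an illustrative assumption, not the authors' construction: the function names, the choice of zlib, and the conditional-mutual-information-style score are all chosen only to show how compressed lengths can stand in for information content.

# Hedged illustration: compressed length as a computable information measure,
# in the spirit of "the length of a text after applying existing data
# compression schemes". Assumed sketch, not the paper's method.
import zlib

def clen(*parts: bytes) -> int:
    """Compressed length (zlib, level 9) of the concatenation of byte strings."""
    return len(zlib.compress(b"".join(parts), 9))

def compression_cmi(x: bytes, y: bytes, z: bytes = b"") -> int:
    """Compression-based analogue of conditional mutual information I(X;Y|Z):
    C(XZ) + C(YZ) - C(Z) - C(XYZ). Scores near zero suggest (approximate)
    conditional independence under this measure."""
    return clen(x, z) + clen(y, z) - clen(z) - clen(x, y, z)

if __name__ == "__main__":
    x = b"the quick brown fox jumps over the lazy dog " * 20
    y = x.replace(b"fox", b"cat")   # nearly a copy of x: strongly dependent
    w = bytes(range(256)) * 4       # unrelated content
    print("score(x, y):", compression_cmi(x, y))   # expected: comparatively large
    print("score(x, w):", compression_cmi(x, w))   # expected: close to zero

Within the paper's framework, such a measure would further need to be (approximately) submodular for the generalized CMC to apply; the sketch only illustrates the computational idea of compression-based scoring.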

Details

Language: -
Date: 2010-06
Publication status: Published
Pages: -
Publishing info: -
Table of contents: -
Peer review: -
Identifiers (DOI, ISBN, etc.): URI: http://www.colt2010.org/
BibTeX citation ID: 6772
Degree: -

Related Event

Event name: 23rd Annual Conference on Learning Theory (COLT 2010)
Venue: Haifa, Israel
Start/End date: -

Legal Case


Project information


Publication 1

Title: 23rd Annual Conference on Learning Theory (COLT 2010)
Type: Conference Proceedings
Authors/Editors:
Affiliations:
Publisher, Place: Madison, WI, USA : OmniPress
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 464 - 476
Identifiers (ISBN, ISSN, DOI, etc.): -