
Released

Conference Paper

#### Regression by dependence minimization and its application to causal inference in additive noise models


##### Citation

Mooij, J., Janzing, D., Peters, J., & Schölkopf, B. (2009). Regression by dependence minimization and its application to causal inference in additive noise models. *Proceedings of the 26th International Conference on Machine Learning (ICML 2009)*, 745-752.

Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-C4A3-8

##### Abstract

Motivated by causal inference problems, we propose a novel method for regression that minimizes the statistical dependence between regressors and residuals. The key advantage of this approach to regression is that it does not assume a particular distribution of the noise, i.e., it is non-parametric with respect to the noise distribution. We argue that the proposed regression method is well suited to the task of causal inference in additive noise models. A practical disadvantage is that the resulting optimization problem is generally non-convex and can be difficult to solve. Nevertheless, we report good results on one of the tasks of the NIPS 2008 Causality Challenge, where the goal is to distinguish causes from effects in pairs of statistically dependent variables. In addition, we propose an algorithm for efficiently inferring causal models from observational data for more than two variables. The required number of regressions and independence tests is quadratic in the number of variables, which is a significant improvement over the simple method that tests all possible DAGs.
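
The core idea of the abstract — fitting a regression function by minimizing the statistical dependence between regressors and residuals rather than the squared error — can be sketched as follows. This is only an illustrative toy, not the paper's implementation: it uses a biased HSIC estimate with Gaussian kernels as the dependence measure, a polynomial model class, and Nelder–Mead as the (local) optimizer for the non-convex objective; the kernel bandwidth `sigma`, the toy data, and the cubic model are all arbitrary choices made here for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def hsic(x, y, sigma=1.0):
    """Biased HSIC estimate with Gaussian kernels (a standard dependence measure)."""
    n = len(x)
    def gram(v):
        d = v[:, None] - v[None, :]
        return np.exp(-d**2 / (2 * sigma**2))
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    K, L = gram(x), gram(y)
    return np.trace(K @ H @ L @ H) / (n - 1)**2

# Toy additive-noise data: y = x^3 + non-Gaussian (exponential) noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = x**3 + 0.1 * rng.exponential(size=100)

# Regression by dependence minimization: choose polynomial coefficients
# theta so that HSIC(x, residuals) is minimized, instead of squared error.
def objective(theta):
    residuals = y - np.polyval(theta, x)
    return hsic(x, residuals)

theta0 = np.polyfit(x, y, 3)                     # least-squares warm start
res = minimize(objective, theta0, method="Nelder-Mead")
residuals = y - np.polyval(res.x, x)
print("HSIC after least squares:", objective(theta0))
print("HSIC after dependence minimization:", res.fun)
```

Because the objective is non-convex (as the abstract notes), the warm start from ordinary least squares matters in practice. For the cause-effect task, one would fit both directions (y on x and x on y) this way and prefer the direction whose residuals are more nearly independent of the input.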