Released

Journal Article

Entropy Search for Information-Efficient Global Optimization

MPS-Authors
/persons/resource/persons84387

Hennig, P
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;

/persons/resource/persons84198

Schuler, CJ
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;

Citation

Hennig, P., & Schuler, C. J. (2012). Entropy Search for Information-Efficient Global Optimization. Journal of Machine Learning Research, 13, 1809-1837.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-B710-C
Abstract
Contemporary global optimization algorithms are based on local measures of utility, rather than a probability measure over location and value of the optimum. They thus attempt to collect low function values, not to learn about the optimum. The reason for the absence of probabilistic global optimizers is that the corresponding inference problem is intractable in several ways. This paper develops desiderata for probabilistic optimization algorithms, then presents a concrete algorithm which addresses each of the computational intractabilities with a sequence of approximations and explicitly addresses the decision problem of maximizing information gain from each evaluation.
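
The sketch below illustrates the core idea named in the abstract: instead of chasing low function values, pick the next evaluation that most reduces the entropy of p_min, the belief over where the minimum lies. It is not the paper's algorithm (the paper uses Expectation Propagation and analytic approximations); this is a crude Monte Carlo illustration with an assumed RBF-kernel Gaussian process, a fixed 1-D grid, a toy objective, and arbitrary hyperparameters.

import numpy as np

def rbf(a, b, length=0.2, sig=1.0):
    """Squared-exponential kernel on 1-D inputs (hyperparameters are assumptions)."""
    d = a[:, None] - b[None, :]
    return sig**2 * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Standard GP regression: posterior mean and covariance on the grid Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, Kss - v.T @ v

def pmin_entropy(mu, cov, n_samples=500, rng=None):
    """Monte Carlo entropy of p_min, the belief over which grid point is the minimum."""
    rng = np.random.default_rng(0) if rng is None else rng
    w, V = np.linalg.eigh((cov + cov.T) / 2)           # symmetrize, then factor
    A = V * np.sqrt(np.clip(w, 0.0, None))             # clip tiny negative eigenvalues
    f = mu + rng.standard_normal((n_samples, len(mu))) @ A.T
    p = np.bincount(np.argmin(f, axis=1), minlength=len(mu)) / n_samples
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_pmin_entropy(X, y, Xs, x_cand, n_fantasies=10, rng=None):
    """Average entropy of p_min after fantasized observations at candidate x_cand."""
    rng = np.random.default_rng(1) if rng is None else rng
    mu_c, cov_c = gp_posterior(X, y, np.array([x_cand]))
    h = 0.0
    for _ in range(n_fantasies):
        y_f = mu_c[0] + np.sqrt(max(cov_c[0, 0], 0.0)) * rng.standard_normal()
        mu, cov = gp_posterior(np.append(X, x_cand), np.append(y, y_f), Xs)
        h += pmin_entropy(mu, cov, rng=rng)
    return h / n_fantasies

# Usage on a toy 1-D objective (objective, grid, and seeds are assumptions):
rng = np.random.default_rng(42)
f_true = lambda x: np.sin(3.0 * x) + 0.5 * x
X = rng.uniform(0.0, 2.0, size=4)
y = f_true(X) + 1e-2 * rng.standard_normal(4)
grid = np.linspace(0.0, 2.0, 50)
h_now = pmin_entropy(*gp_posterior(X, y, grid))
gains = [h_now - expected_pmin_entropy(X, y, grid, xc) for xc in grid]
print("most informative next evaluation: x =", grid[int(np.argmax(gains))])

The selection rule is purely informational: the candidate with the largest expected entropy reduction may lie far from the current best observation if evaluating there would tell us most about the location of the minimum, which is the contrast the abstract draws with utility-based acquisition rules.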