Released

Journal Article

Entropy Search for Information-Efficient Global Optimization

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84387

Hennig, P
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84198

Schuler, CJ
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society;

Citation

Hennig, P., & Schuler, C. (2012). Entropy Search for Information-Efficient Global Optimization. Journal of Machine Learning Research, 13, 1809-1837.


Cite as: http://hdl.handle.net/11858/00-001M-0000-000E-FDA7-D
Abstract
Contemporary global optimization algorithms are based on local measures of utility, rather than a probability measure over the location and value of the optimum. They thus attempt to collect low function values, not to learn about the optimum. The reason for the absence of probabilistic global optimizers is that the corresponding inference problem is intractable in several ways. This paper develops desiderata for probabilistic optimization algorithms, then presents a concrete algorithm that addresses each of the computational intractabilities with a sequence of approximations and explicitly addresses the decision problem of maximizing information gain from each evaluation.
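
The central object in this information-based view is a belief over where the minimum lies, and the next evaluation is chosen to reduce the entropy of that belief. The sketch below is a minimal brute-force illustration of this idea in plain NumPy, under simplifying assumptions I introduce here (a discretized 1D domain, Monte Carlo estimates of the minimizer distribution, and a one-step fantasized lookahead); the helper names such as gp_posterior and p_min are illustrative and are not taken from the paper, which instead uses analytic approximations to make the same selection tractable.

# Illustrative sketch of information-based global optimization on a
# discretized 1D domain. This is NOT the paper's Entropy Search algorithm
# (which uses Expectation Propagation and first-order approximations);
# it only shows the underlying idea of reducing entropy of a belief over
# the minimizer.
import numpy as np

rng = np.random.default_rng(0)


def rbf_kernel(a, b, length_scale=0.2, signal_var=1.0):
    # Squared-exponential covariance between two 1D input arrays.
    d = a[:, None] - b[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)


def gp_posterior(x_obs, y_obs, x_grid, noise_var=1e-4):
    # Posterior mean and covariance of a zero-mean GP on the grid.
    K = rbf_kernel(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    Ks = rbf_kernel(x_obs, x_grid)
    Kss = rbf_kernel(x_grid, x_grid)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    v = np.linalg.solve(L, Ks)
    return Ks.T @ alpha, Kss - v.T @ v


def p_min(mean, cov, n_samples=2000):
    # Monte Carlo estimate of the belief p_min(x): Pr[x is the minimizer].
    f = rng.multivariate_normal(mean, cov + 1e-8 * np.eye(len(mean)), n_samples)
    idx = f.argmin(axis=1)
    return np.bincount(idx, minlength=len(mean)) / n_samples


def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))


# Toy objective and a few initial observations.
f_true = lambda x: np.sin(3 * x) + 0.5 * x
x_grid = np.linspace(0.0, 2.0, 50)
x_obs = np.array([0.2, 1.0, 1.8])
y_obs = f_true(x_obs)

mean, cov = gp_posterior(x_obs, y_obs, x_grid)
h_now = entropy(p_min(mean, cov))

# One-step lookahead: pick the evaluation expected to shrink the entropy
# of p_min the most, averaging over fantasized outcomes at each candidate.
gains = np.zeros(len(x_grid))
for i, x_cand in enumerate(x_grid):
    std_i = np.sqrt(max(cov[i, i], 1e-12))
    fantasies = mean[i] + std_i * rng.standard_normal(5)
    h_after = []
    for y_f in fantasies:
        m2, c2 = gp_posterior(np.append(x_obs, x_cand),
                              np.append(y_obs, y_f), x_grid)
        h_after.append(entropy(p_min(m2, c2, n_samples=500)))
    gains[i] = h_now - np.mean(h_after)

print("next evaluation at x =", x_grid[gains.argmax()])

The sampling and grid discretization above are what make the naive version expensive; the paper's contribution, as the abstract states, is a sequence of approximations that make this information-gain criterion computable in practice.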