
Record


Released

Conference Paper

Tracking a Small Set of Experts by Mixing Past Posteriors

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83824

Bousquet, O.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Bousquet, O. (2001). Tracking a Small Set of Experts by Mixing Past Posteriors. In Proceedings of the 14th Annual Conference on Computational Learning Theory, Lecture Notes in Computer Science (pp. 31-47).


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-E380-B
Abstract
In this paper, we examine on-line learning problems in which the target concept is allowed to change over time. In each trial a master algorithm receives predictions from a large set of n experts. Its goal is to predict almost as well as the best sequence of such experts chosen off-line by partitioning the training sequence into k+1 sections and then choosing the best expert for each section. We build on methods developed by Herbster and Warmuth and consider an open problem posed by Freund where the experts in the best partition are from a small pool of size m. Since k >> m, the best expert shifts back and forth between the experts of the small pool. We propose algorithms that solve this open problem by mixing the past posteriors maintained by the master algorithm. We relate the number of bits needed for encoding the best partition to the loss bounds of the algorithms. Instead of paying \log n for choosing the best expert in each section, we first pay \log \binom{n}{m} bits in the bounds for identifying the pool of m experts and then \log m bits per new section. In the bounds we also pay twice for encoding the boundaries of the sections.
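The mixing idea described in the abstract can be sketched as follows: the master keeps exponential weights over the n experts, and after each trial's loss update it mixes a small fraction of probability mass back in from the stored past posteriors, so that a previously good expert can be recovered quickly when the best expert shifts back to it. This is only an illustrative sketch, not the paper's exact algorithm or tuned parameters; the uniform mixing over all past posteriors, the learning rate `eta`, and the mixing weight `alpha` are assumptions chosen for clarity.

```python
import numpy as np

def mixing_past_posteriors(expert_losses, eta=1.0, alpha=0.1):
    """Illustrative sketch of a mixing-past-posteriors master algorithm.

    expert_losses: (T, n) array; expert_losses[t, i] is the loss
    incurred by expert i in trial t.
    Returns the (T, n) sequence of weight vectors the master would
    use to combine the experts' predictions in each trial.
    """
    T, n = expert_losses.shape
    weights = np.full(n, 1.0 / n)   # current posterior over experts
    past = [weights.copy()]         # all stored past posteriors
    history = []
    for t in range(T):
        history.append(weights.copy())
        # Loss update: exponential weighting on this trial's losses.
        vm = weights * np.exp(-eta * expert_losses[t])
        vm /= vm.sum()
        past.append(vm.copy())
        # Mixing update: keep (1 - alpha) of the new posterior and
        # spread alpha uniformly over all stored past posteriors,
        # so weights of formerly good experts never decay to zero.
        weights = (1 - alpha) * vm + alpha * np.mean(past, axis=0)
    return np.array(history)
```

Because every past posterior retains some mass, an expert from the small pool that was best in an earlier section regains high weight after only a few trials once it becomes best again, which is the key to the improved \log m-per-section bounds.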