Record


Released

Research Paper

Discrete-Continuous Splitting for Weakly Supervised Learning

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons201523

Lange, Jan-Hendrik
Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons98382

Andres, Bjoern
Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society;

External Resources
No external resources are available.
Full texts (freely accessible)

arXiv:1705.05020.pdf
(Preprint), 542 KB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Laude, E., Lange, J.-H., Schmidt, F. R., Andres, B., & Cremers, D. (2017). Discrete-Continuous Splitting for Weakly Supervised Learning. Retrieved from http://arxiv.org/abs/1705.05020.


Citation link: http://hdl.handle.net/11858/00-001M-0000-002D-8B47-A
Abstract
This paper introduces a novel algorithm for a class of weakly supervised learning tasks. The considered tasks are posed as joint optimization problems in the continuous model parameters and the (a priori unknown) discrete label variables. In contrast to prior approaches such as convex relaxations, we decompose the nonconvex problem into purely discrete and purely continuous subproblems in a way that is amenable to distributed optimization by the Alternating Direction Method of Multipliers (ADMM). This approach preserves the integrality of the discrete label variables and, for a reparameterized variant of the algorithm using kernels, guarantees global convergence to a critical point. The resulting method implicitly alternates between a discrete and a continuous variable update; however, it is inherently different from a discrete-continuous coordinate descent scheme (hard EM). In diverse experiments we show that our method can learn a classifier from weak supervision that takes the form of hard and soft constraints on the labeling, and that it outperforms hard EM in this task.
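The alternation the abstract describes can be illustrated with a generic consensus-ADMM loop: a continuous update of the model parameters, a discrete update that rounds the consensus variable to integral labels, and a dual update coupling the two. The sketch below is a minimal toy version of this splitting pattern, not the paper's exact formulation; the least-squares continuous step, binary labels, and all variable names are illustrative assumptions.

```python
import numpy as np

def admm_discrete_continuous(X, steps=50, seed=0):
    """Toy discrete-continuous ADMM alternation (illustrative sketch,
    not the formulation from Laude et al. 2017).

    Continuous block: model parameters w with scores u = X @ w.
    Discrete block:   binary labels z in {0, 1}.
    Consensus constraint u = z, enforced via scaled dual variable lam.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    z = rng.integers(0, 2, size=n).astype(float)  # discrete labels
    lam = np.zeros(n)                             # scaled dual variable

    for _ in range(steps):
        # Continuous update: fit scores to the shifted discrete target
        # (here a plain least-squares subproblem, an assumed stand-in).
        w, *_ = np.linalg.lstsq(X, z - lam, rcond=None)
        u = X @ w
        # Discrete update: round the consensus variable, keeping z integral.
        z = (u + lam > 0.5).astype(float)
        # Dual update: accumulate the constraint residual u - z.
        lam += u - z
    return w, z
```

The key property mirrored here is that the discrete variables stay integral throughout, rather than being relaxed to a continuous range as in convex-relaxation approaches.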