Record

  Discrete-Continuous Splitting for Weakly Supervised Learning

Laude, E., Lange, J.-H., Schmidt, F. R., Andres, B., & Cremers, D. (2017). Discrete-Continuous Splitting for Weakly Supervised Learning. Retrieved from http://arxiv.org/abs/1705.05020.


Files

arXiv:1705.05020.pdf (Preprint), 542KB
Name:
arXiv:1705.05020.pdf
Description:
File downloaded from arXiv at 2017-07-05 10:59
OA status:
Visibility:
Public
MIME type / checksum:
application/pdf / [MD5]
Technical metadata:
Copyright date:
-
Copyright info:
-

Creators

Creators:
Laude, Emanuel (1), Author
Lange, Jan-Hendrik (2), Author
Schmidt, Frank R. (1), Author
Andres, Bjoern (2), Author
Cremers, Daniel (1), Author
Affiliations:
(1) External Organizations, ou_persistent22
(2) Computer Vision and Multimodal Computing, MPI for Informatics, Max Planck Society, ou_1116547

Content

Keywords: Computer Science, Learning, cs.LG
Abstract: This paper introduces a novel algorithm for a class of weakly supervised learning tasks. The considered tasks are posed as joint optimization problems in the continuous model parameters and the (a-priori unknown) discrete label variables. In contrast to prior approaches such as convex relaxations, we decompose the nonconvex problem into purely discrete and purely continuous subproblems in a way that is amenable to distributed optimization by the Alternating Direction Method of Multipliers (ADMM). This approach preserves integrality of the discrete label variables and, for a reparameterized variant of the algorithm using kernels, guarantees global convergence to a critical point. The resulting method implicitly alternates between a discrete and a continuous variable update; however, it is inherently different from a discrete-continuous coordinate descent scheme (hard EM). In diverse experiments we show that our method can learn a classifier from weak supervision that takes the form of hard and soft constraints on the labeling, and that it outperforms hard EM in this task.
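The splitting idea in the abstract can be sketched in a few lines of NumPy. The toy problem below (data, cardinality constraint, penalty parameter, and variable names are all illustrative assumptions, not the paper's actual formulation) fits linear scores X @ w to unknown binary labels y, with weak supervision expressed as a hard constraint that exactly k labels are positive; an ADMM-style loop then alternates a purely continuous least-squares update, a purely discrete projection that keeps the labels exactly integral, and a dual update on the coupling constraint:

```python
import numpy as np

# Toy weakly supervised setup (hypothetical): n samples, d features,
# and a hard constraint that exactly k of the n binary labels are 1.
rng = np.random.default_rng(0)
n, d, k = 20, 3, 8
X = rng.standard_normal((n, d))

w = np.zeros(d)    # continuous model parameters
y = np.zeros(n)    # discrete label variables in {0, 1}
lam = np.zeros(n)  # scaled dual variable for the coupling X @ w = y

for _ in range(50):
    # Purely continuous subproblem: ridge-regularized least squares in w,
    # fitting the current labels shifted by the dual variable.
    w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(d), X.T @ (y - lam))
    # Purely discrete subproblem: project the scores onto the constraint
    # set {y in {0,1}^n : sum(y) = k} by picking the k largest entries.
    # Integrality of y is preserved exactly, as the abstract emphasizes.
    s = X @ w + lam
    y = np.zeros(n)
    y[np.argsort(-s)[:k]] = 1.0
    # Dual ascent on the consensus constraint.
    lam += X @ w - y
```

This is only a sketch of the general alternating structure; the paper's method, convergence guarantee, and kernel reparameterization differ in the details.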

Details

Language(s): eng - English
Dates: 2017-05-14, 2017-06-19, 2017
Publication status: Published online
Pages: 15 p.
Place, publisher, edition: -
Table of contents: -
Type of review: -
Identifiers: arXiv: 1705.05020
URI: http://arxiv.org/abs/1705.05020
BibTex Citekey: DBLP:journals/corr/LaudeLSAC17
Degree type: -
