
Released

Report

Infinite Kernel Learning

MPG Authors

Gehler, P. V.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Nowozin, S.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available.
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)

MPIK-TR-178.pdf
(publisher version), 294KB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Gehler, P., & Nowozin, S. (2008). Infinite Kernel Learning (178). Tübingen, Germany: Max Planck Institute for Biological Cybernetics.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-C6CD-D
Abstract
In this paper we consider the problem of automatically learning the kernel from general kernel classes. Specifically, we build upon the Multiple Kernel Learning (MKL) framework and in particular on the work of Argyriou, Hauser, Micchelli, and Pontil (2006). We formulate the problem as a Semi-Infinite Program (SIP) and devise a new algorithm, Infinite Kernel Learning (IKL), to solve it. The IKL algorithm is applicable to both the finite and the infinite case, and we find it to be faster and more stable than SimpleMKL (Rakotomamonjy, Bach, Canu, & Grandvalet, 2007) when many kernels are involved. In the second part we present the first large-scale comparison of SVMs to MKL on a variety of benchmark datasets, also including IKL. The results show two things: a) for many datasets there is no benefit in linearly combining kernels with MKL/IKL over a plain SVM classifier, so the flexibility of using more than one kernel seems to be of no use; b) on some datasets IKL yields impressive gains in accuracy over SVM/MKL due to the possibility of using a greatly enlarged kernel set. In those cases IKL remains practical, whereas both cross-validation and standard MKL are infeasible.
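To illustrate the kernel-combination idea behind MKL/IKL, here is a minimal sketch (not the authors' implementation) of a fixed linear combination of RBF Gram matrices fed to an SVM via scikit-learn's precomputed-kernel interface; MKL would optimize the weights `d_m`, and IKL would additionally search over a continuous family of kernel parameters, whereas this sketch assumes uniform weights and a small fixed candidate set:

```python
# Sketch: combined kernel K = sum_m d_m * K_m used by an SVM classifier.
# The uniform weights and the gamma grid below are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

gammas = [0.01, 0.1, 1.0]                      # candidate kernel parameters
weights = np.ones(len(gammas)) / len(gammas)   # d_m, uniform here (MKL learns these)

# Combined Gram matrix over the training points
K = sum(w * rbf_kernel(X, X, gamma=g) for w, g in zip(weights, gammas))

clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```

At test time the same weighted combination must be evaluated between test and training points (`rbf_kernel(X_test, X, gamma=g)`) before calling `clf.predict`.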