Conference Paper

Learning output kernels with block coordinate descent


Dinuzzo, F.
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society

Ong, C. S.
Max Planck Society

Gehler, P. V.
Dept. Perceiving Systems, Max Planck Institute for Intelligent Systems, Max Planck Society

Pillonetto, G.


Cite as:
Dinuzzo, F., Ong, C. S., Gehler, P. V., & Pillonetto, G. (2011). Learning output kernels with block coordinate descent. In L. Getoor, & T. Scheffer (Eds.), Proceedings of the 28th International Conference on Machine Learning (pp. 49-56).
Abstract:
We propose a method to learn simultaneously a vector-valued function and a kernel between its components. The obtained kernel can be used both to improve learning performance and to reveal structures in the output space that may be important in their own right. Our method is based on the solution of a suitable regularization problem over a reproducing kernel Hilbert space (RKHS) of vector-valued functions. Although the regularized risk functional is non-convex, we show that it is invex, implying that all local minimizers are global minimizers. We derive a block-wise coordinate descent method that efficiently exploits the structure of the objective functional. Then, we empirically demonstrate that the proposed method can improve classification accuracy. Finally, we provide a visual interpretation of the learned kernel matrix for some well-known datasets.
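
To make the block-wise scheme concrete, the following is a minimal NumPy sketch of one plausible instantiation, not the authors' exact algorithm: the objective J below, the parameters lam and mu, and the projected-gradient update for L are simplifying assumptions. With a separable kernel k(x, x')*L, the representer theorem reduces the vector-valued function to f(x) = sum_i k(x, x_i) L c_i, so the two blocks are the coefficient matrix C and the output kernel L.

import numpy as np

def okl_bcd(K, Y, lam=0.1, mu=0.1, n_iter=50):
    """Block coordinate descent sketch for the (assumed) objective

        J(C, L) = 0.5*||Y - K C L||_F^2      # squared loss
                  + 0.5*lam*tr(L C^T K C)    # RKHS norm of f
                  + 0.5*mu*||L||_F^2         # penalty on the output kernel

    K: (n, n) PSD input kernel matrix; Y: (n, m) targets;
    C: (n, m) coefficients; L: (m, m) PSD output kernel.
    """
    n, m = Y.shape
    L = np.eye(m)              # start from the identity output kernel
    dK, U = np.linalg.eigh(K)  # K = U diag(dK) U^T, computed once

    for _ in range(n_iter):
        # C-step: exact block minimizer, obtained by solving the
        # Sylvester-type equation K C L + lam*C = Y. Diagonalizing K and L
        # turns it into an elementwise division in the joint eigenbasis.
        dL, V = np.linalg.eigh(L)
        Ct = (U.T @ Y @ V) / (np.outer(dK, dL) + lam)
        C = U @ Ct @ V.T

        # L-step: one projected gradient step on J(C, .). (The paper's
        # method exploits the structure of this subproblem; the generic
        # step used here is a stand-in that still gives a valid block update.)
        KC = K @ C
        E = C.T @ KC                              # C^T K C, symmetric PSD
        G = KC.T @ (KC @ L - Y) + 0.5 * lam * E + mu * L
        lip = np.linalg.norm(KC.T @ KC, 2) + mu   # gradient Lipschitz bound
        L = L - G / lip
        L = 0.5 * (L + L.T)                       # keep L symmetric
        w, Q = np.linalg.eigh(L)
        L = (Q * np.clip(w, 0.0, None)) @ Q.T     # project onto the PSD cone
    return C, L

# Example on toy data (hypothetical usage):
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 5))
K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :])**2).sum(-1))  # Gaussian kernel
Y = rng.standard_normal((40, 3))
C, L = okl_bcd(K, Y)

Diagonalizing K once and L at each sweep keeps the C-step at roughly O(n^2 m + m^3) per iteration, instead of solving the nm-by-nm linear system a naive formulation would require.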