
Conference Paper

Learning low-rank output kernels

MPS-Authors

Dinuzzo, F.
Dept. Empirical Inference, Max Planck Institute for Intelligent Systems, Max Planck Society

Citation

Dinuzzo, F., & Fukumizu, K. (2011). Learning low-rank output kernels. In C.-N. Hsu & W. Lee (Eds.), Asian Conference on Machine Learning, 14-15 November 2011, South Garden Hotels and Resorts, Taoyuan, Taiwan (pp. 181-196). Cambridge, MA, USA: JMLR.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-B922-E
Abstract
Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a positive semidefinite matrix that describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for an architecture with two layers. Then, we show that a variety of methods, such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low-rank matrix approximation, can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.
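
The sketch below illustrates the alternating idea described in the abstract: factor the output kernel as L = B Bᵀ and cycle block coordinate updates between the coefficient matrix C and the factor B. The specific objective (squared loss plus a kernel-weighted trace regularizer), the gradient-step updates, and all variable names are assumptions made for illustration; this is not the authors' exact algorithm or formulation.

import numpy as np

def okl_low_rank(K, Y, rank, lam=0.1, step=1e-3, iters=500, seed=0):
    # Illustrative block coordinate descent for an assumed objective
    #   J(C, B) = 0.5*||Y - K C B B^T||_F^2 + 0.5*lam*tr(B^T C^T K C B),
    # where the output kernel is constrained to L = B B^T (rank <= rank).
    n, m = Y.shape
    rng = np.random.default_rng(seed)
    C = np.zeros((n, m))                       # kernel expansion coefficients
    B = rng.standard_normal((m, rank)) / np.sqrt(m)
    for _ in range(iters):
        # Block 1: gradient step in C with B (hence L) fixed.
        L = B @ B.T                            # low-rank output kernel
        R = K @ C @ L - Y                      # residual
        C -= step * K @ (R @ L + lam * C @ L)
        # Block 2: gradient step in B with C fixed.
        A = K @ C
        R = A @ B @ B.T - Y
        B -= step * (A.T @ R + R.T @ A + lam * C.T @ A) @ B
    return C, B

# Toy usage: Gaussian kernel on random inputs, two correlated outputs,
# so a rank-1 output kernel should capture most of the structure.
X = np.random.default_rng(1).standard_normal((50, 3))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)
Y = np.column_stack([X @ [1.0, -1.0, 0.5], X @ [0.9, -1.1, 0.4]])
C, B = okl_low_rank(K, Y, rank=1)
print("fit error:", np.linalg.norm(K @ C @ (B @ B.T) - Y))

Working on the factor B rather than on L itself keeps the rank constraint satisfied by construction at every iteration, which is the structural point the abstract emphasizes; the paper's actual updates and regularization may differ from this gradient-step sketch.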