Keywords:
-
Abstract:
We address the problem of learning hyperparameters in kernel methods for
which the Hessian of the objective is structured. We propose an approximation
to the cross-validation log likelihood whose gradient can be computed
analytically, solving the hyperparameter learning problem efficiently
through nonlinear optimization. Crucially, our learning method is based
entirely on matrix-vector multiplication primitives with the kernel
matrices and their derivatives, allowing straightforward specialization to
new kernels or to large datasets. When applied to the problem of multi-way
classification, our method scales linearly in the number of classes and
gives rise to state-of-the-art results on a remote imaging task.
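The abstract's key computational claim is that learning can be driven entirely by matrix-vector products with the kernel matrix and its hyperparameter derivatives. The sketch below illustrates that general pattern, not the paper's cross-validation approximation: it computes the gradient of a standard Gaussian-process negative log marginal likelihood with respect to an RBF lengthscale, using only conjugate-gradient solves (matvecs with `K`) and a Hutchinson trace estimator (matvecs with `dK`). All function names, the toy data, and the choice of objective are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Toy regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = np.sin(X[:, 0])

def rbf(X, ell):
    # K_ij = exp(-||x_i - x_j||^2 / (2 ell^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell**2))

def drbf_dell(X, ell):
    # Elementwise derivative of the RBF kernel w.r.t. the lengthscale ell.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell**2)) * d2 / ell**3

def grad_nll(ell, noise=1e-2, probes=10):
    """Gradient of 0.5*(y^T K^{-1} y + log|K|) w.r.t. ell,
    using only matrix-vector products with K and dK/dell."""
    n = len(X)
    K = rbf(X, ell) + noise * np.eye(n)
    dK = drbf_dell(X, ell)
    # Expose K only through its matvec, as a black-box operator.
    Kop = LinearOperator((n, n), matvec=lambda v: K @ v)
    # Solve K alpha = y by conjugate gradients (matvecs only).
    alpha, _ = cg(Kop, y)
    # Hutchinson estimator: tr(K^{-1} dK) ~ mean_z z^T K^{-1} dK z
    # with Rademacher probe vectors z.
    tr = 0.0
    for _ in range(probes):
        z = rng.choice([-1.0, 1.0], size=n)
        w, _ = cg(Kop, dK @ z)
        tr += z @ w
    tr /= probes
    # d/d ell:  0.5*tr(K^{-1} dK)  -  0.5*alpha^T dK alpha
    return 0.5 * (tr - alpha @ dK @ alpha)

g = grad_nll(1.0)
```

The resulting scalar gradient can be handed to any nonlinear optimizer; because the objective and gradient never form `K^{-1}` explicitly, the same code specializes to any kernel for which a matvec routine for `K` and `dK` is available.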