Free keywords:
-
Abstract:
We propose fast algorithms for reducing the number of kernel evaluations in the testing
phase for methods such as Support Vector Machines (SVM) and Ridge Regression (RR). For
non-sparse methods such as RR, this results in significantly improved prediction time.
For binary SVMs, which are already sparse in their expansion, the payoff comes mainly in
noisy or large-scale problems. We then extend our method to multi-class problems, where
choosing the expansion so that the reduced set of vectors describes all the hyperplanes
jointly again yields significant gains.
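To illustrate the general idea behind the abstract (not the paper's specific algorithm), the sketch below trains a kernel ridge regressor, whose prediction normally requires a kernel evaluation against every training point, and then compresses it to a reduced set of m vectors by least-squares matching of the full model's outputs. All names (`rbf`, `Z`, `beta`, the choice of m and gamma) are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Kernel ridge regression: alpha = (K + lam*I)^{-1} y.
# Prediction f(x) = sum_i alpha_i k(x_i, x) touches ALL 200 training points.
lam = 0.1
K = rbf(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
f_full = K @ alpha  # full model's outputs on the training set

# Reduced set: pick m << n vectors Z (here a random subset, purely for
# illustration) and fit beta so that K(X, Z) @ beta approximates f_full.
m = 20
Z = X[rng.choice(len(X), m, replace=False)]
beta, *_ = np.linalg.lstsq(rbf(X, Z), f_full, rcond=None)

# Predicting at a new point now costs m kernel evaluations instead of n.
x_new = np.array([[0.3, -0.5]])
pred_full = rbf(x_new, X) @ alpha
pred_reduced = rbf(x_new, Z) @ beta
```

The same compression applies to an SVM decision function, and the multi-class case described above would fit one shared set Z across all hyperplanes, with a separate coefficient vector beta per class.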