
Released

Conference Paper

An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83855

Chapelle, O.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Keerthi, S., Sindhwani, V., & Chapelle, O. (2007). An Efficient Method for Gradient-Based Adaptation of Hyperparameters in SVM Models. Advances in Neural Information Processing Systems 19: Proceedings of the 2006 Conference, 673-680.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-CBD3-F
Abstract
We consider the task of tuning hyperparameters in SVM models by minimizing a smooth validation performance function, e.g., a smoothed k-fold cross-validation error, using non-linear optimization techniques. The key computation in this approach is the gradient of the validation function with respect to the hyperparameters. We show that for large-scale problems involving a wide choice of kernel-based models and validation functions, this computation can be done very efficiently, often in just a fraction of the training time. Empirical results show that our approach identifies a near-optimal set of hyperparameters with very few training rounds and gradient computations.
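To make the idea concrete, here is a minimal sketch of gradient-based hyperparameter adaptation, not the paper's method: it uses kernel ridge regression (a smooth stand-in for the SVM models and smoothed validation losses treated in the paper) and tunes a single regularization hyperparameter λ by gradient descent on a held-out squared error, with the gradient obtained analytically from the training solve. All names (`rbf`, `val_loss_and_grad`, the toy data) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy 1-D regression data (stand-in for a real train/validation split)
X_tr = rng.uniform(-3, 3, (40, 1))
y_tr = np.sin(X_tr[:, 0]) + 0.1 * rng.normal(size=40)
X_va = rng.uniform(-3, 3, (20, 1))
y_va = np.sin(X_va[:, 0])

K_tr = rbf(X_tr, X_tr)   # kernel among training points
K_va = rbf(X_va, X_tr)   # kernel between validation and training points

def val_loss_and_grad(log_lam):
    """Validation loss and its gradient w.r.t. log(lambda).

    Training solve: alpha = (K_tr + lam*I)^{-1} y_tr.
    Since d(alpha)/d(lam) = -(K_tr + lam*I)^{-1} alpha, the chain rule
    through the validation residual gives the gradient analytically,
    at the cost of one extra linear solve against the same matrix.
    """
    lam = np.exp(log_lam)
    A = K_tr + lam * np.eye(len(X_tr))
    alpha = np.linalg.solve(A, y_tr)
    r = K_va @ alpha - y_va              # validation residual
    loss = 0.5 * (r @ r)
    dloss_dlam = -(K_va.T @ r) @ np.linalg.solve(A, alpha)
    return loss, lam * dloss_dlam        # gradient in log-space (keeps lam > 0)

# Gradient descent on log(lambda), starting over-regularized
log_lam, lr = np.log(10.0), 0.05
losses = []
for _ in range(100):
    loss, g = val_loss_and_grad(log_lam)
    losses.append(loss)
    log_lam -= lr * g
```

The point mirrored from the abstract is that the hyperparameter gradient comes almost for free: it reuses the factorized training system, so each adaptation step costs only a fraction of a full retraining from scratch.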