Released

Conference Paper

The entropy regularization information criterion

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84193

Shawe-Taylor, J., Schölkopf, B.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Smola, A., Shawe-Taylor, J., Schölkopf, B., & Williamson, R. (2000). The entropy regularization information criterion. Advances in Neural Information Processing Systems, 342-348.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-E4CA-0
Abstract
Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering general linear additive models. This is achieved by a data-dependent analysis of the eigenvalues of the corresponding design matrix.
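
As a rough illustration of the data-dependent analysis mentioned in the abstract (a minimal sketch, not the authors' actual procedure), the snippet below builds a design matrix from arbitrary parametrized basis functions evaluated on a sample and inspects the eigenvalue spectrum of the resulting Gram matrix. The choice of Gaussian bumps as basis functions and all names (design_matrix, basis, Phi) are hypothetical.

import numpy as np

def design_matrix(x, basis_fns):
    # Evaluate each basis function phi_j on every sample point x_i,
    # giving an (n, m) design matrix with entries phi_j(x_i).
    return np.column_stack([phi(x) for phi in basis_fns])

# Hypothetical example: Gaussian bumps as the parametrized basis.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)           # observed sample
centers = np.linspace(-1.0, 1.0, 20)           # basis parameters
basis = [lambda t, c=c: np.exp(-(t - c) ** 2 / 0.1) for c in centers]

Phi = design_matrix(x, basis)

# Data-dependent spectrum: eigenvalues of the scaled Gram matrix Phi^T Phi.
# A rapidly decaying spectrum indicates low effective capacity of the expansion.
eigvals = np.linalg.eigvalsh(Phi.T @ Phi / len(x))
print(np.sort(eigvals)[::-1][:5])

Because the spectrum is computed from the observed sample rather than from worst-case assumptions, it can serve as the data-dependent quantity on which capacity bounds of the kind described in the abstract are based.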