
Conference Paper

Regularizing AdaBoost

Citation

Rätsch, G., Onoda, T., & Müller, K.-R. (1999). Regularizing AdaBoost. In M. Kearns, S. Solla, & D. Cohn (Eds.), Advances in Neural Information Processing Systems 11 (pp. 564-570). Cambridge, MA, USA: MIT Press.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E69B-8
Abstract
Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting in low-noise cases. On noisy data, however, boosting still tries to enforce a hard margin and thereby gives too much weight to outliers, which leads to the dilemma of non-smooth fits and overfitting. We therefore propose three algorithms that allow for soft-margin classification by introducing regularization with slack variables into the boosting concept: (1) AdaBoost_reg and regularized versions of (2) linear and (3) quadratic programming AdaBoost. Experiments show the usefulness of the proposed algorithms in comparison to another soft-margin classifier: the support vector machine.
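
As a concrete reading of the soft-margin construction sketched in the abstract, the slack-variable idea can be written as a linear program over the combination weights of the base hypotheses. The LaTeX sketch below uses assumed notation (base hypotheses h_t, combination weights alpha_t, margin rho, slack variables xi_i, regularization constant C); it illustrates the general soft-margin linear programming construction rather than the paper's exact formulation.

% Illustrative soft-margin linear program for boosting (assumed notation,
% not taken verbatim from the paper): maximize the margin \rho while
% penalizing slack \xi_i, with trade-off constant C.
\begin{align*}
\max_{\alpha,\;\xi,\;\rho}\quad & \rho \;-\; C \sum_{i=1}^{N} \xi_i \\
\text{subject to}\quad & y_i \sum_{t=1}^{T} \alpha_t h_t(x_i) \;\ge\; \rho - \xi_i,
  \qquad i = 1,\dots,N, \\
& \xi_i \ge 0, \qquad \alpha_t \ge 0, \qquad \sum_{t=1}^{T} \alpha_t = 1.
\end{align*}

Taking C to infinity recovers the hard-margin (maximum-margin) linear program, while a finite C lets individual training points fall short of the margin at a linear cost, which is the mechanism the abstract credits with limiting the influence of outliers.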