

Poster

Single-class Support Vector Machines

Citation

Schölkopf, B., Williamson, R., Smola, A., & Shawe-Taylor, J. (1999). Single-class Support Vector Machines. Poster presented at Dagstuhl-Seminar 99121: Unsupervised Learning, Dagstuhl, Germany.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E79A-2
Abstract

Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified 0 < ν ≤ 1.

We propose an algorithm to deal with this problem by trying to estimate a function f which is positive on S and negative on the complement of S. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space.

We can prove that ν upper bounds the fraction of outliers (training points outside of S) and lower bounds the fraction of support vectors. Asymptotically, under some mild condition on P, both become equalities.

The algorithm is a natural extension of the support vector algorithm to the case of unlabelled data.
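The algorithm described in this abstract is commonly known as the one-class SVM and is available, for example, in scikit-learn. A minimal sketch of its use (the data here is synthetic and chosen only for illustration; `nu` plays the role of ν from the abstract):

```python
import numpy as np
from sklearn.svm import OneClassSVM

# Synthetic training sample standing in for data drawn from P.
rng = np.random.RandomState(0)
X_train = 0.3 * rng.randn(100, 2)

# nu upper-bounds the fraction of training outliers (points outside S)
# and lower-bounds the fraction of support vectors.
clf = OneClassSVM(nu=0.1, kernel="rbf", gamma="scale").fit(X_train)

# predict() returns +1 for points inside the estimated region S, -1 outside.
X_test = np.array([[0.0, 0.0],   # near the bulk of the data
                   [4.0, 4.0]])  # far from the data
pred = clf.predict(X_test)

# The fraction of training outliers should be at most about nu.
outlier_frac = np.mean(clf.predict(X_train) == -1)
```

For this data, the point near the origin is classified as an inlier (+1) and the distant point as an outlier (−1), and the observed training-outlier fraction stays near or below `nu`, matching the bound stated in the abstract.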