Authors: Kathryn Hempstalk, Eibe Frank, Ian H. Witten
DOI: 10.1007/978-3-540-87479-9_51
Keywords:
Abstract: One-class classification has important applications such as outlier and novelty detection. It is commonly tackled using density estimation techniques or by adapting a standard classification algorithm to the problem of carving out a decision boundary that describes the location of the target data. In this paper we investigate a simple method for one-class classification that combines the application of a density estimator, used to form a reference distribution, with the induction of a standard model for class probability estimation. In this method, the reference distribution is used to generate artificial data that is employed to form a second, artificial class. In conjunction with the target class, this artificial class provides the basis for a standard two-class learning problem. We explain how the density function of the reference distribution can be combined with the class probability estimates obtained in this way to form an adjusted estimate of the density function of the target class. Using UCI datasets, and data from a typist recognition problem, we show that the combined model, consisting of both a density estimator and a class probability estimator, can improve on either component technique used alone for one-class classification. We also compare the method to one-class classification using support vector machines.
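The abstract outlines the combination procedure at a high level: fit a reference density to the target data, sample an artificial class from it, train a class probability estimator on the resulting two-class problem, and then correct the reference density with the estimated class odds via Bayes' rule. The sketch below illustrates that idea under stated assumptions; the choice of a multivariate Gaussian as the reference distribution and logistic regression as the class probability estimator, as well as all function names, are illustrative and not taken from the paper.

```python
# Minimal sketch of the combined one-class approach, assuming a Gaussian
# reference density and logistic regression as the class probability estimator.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.linear_model import LogisticRegression


def fit_combined_one_class(target_X, n_artificial=None, random_state=0):
    """Return a function scoring new points with an adjusted target-density estimate."""
    n, d = target_X.shape
    n_artificial = n_artificial or n

    # 1. Reference distribution: a Gaussian fitted to the target data.
    mean = target_X.mean(axis=0)
    cov = np.cov(target_X, rowvar=False) + 1e-6 * np.eye(d)
    reference = multivariate_normal(mean=mean, cov=cov)

    # 2. Artificial class: samples drawn from the reference distribution.
    artificial_X = reference.rvs(size=n_artificial, random_state=random_state)

    # 3. Standard two-class problem: target (label 1) vs. artificial (label 0).
    X = np.vstack([target_X, artificial_X])
    y = np.concatenate([np.ones(n), np.zeros(n_artificial)])
    clf = LogisticRegression(max_iter=1000).fit(X, y)

    prior_target = n / (n + n_artificial)

    def adjusted_density(X_new):
        # 4. Combine estimates via Bayes' rule:
        #    P(x|target) = [(1 - P(T)) / P(T)] * [P(T|x) / (1 - P(T|x))] * P(x|artificial)
        p_t = np.clip(clf.predict_proba(X_new)[:, 1], 1e-12, 1 - 1e-12)
        odds = p_t / (1.0 - p_t)
        prior_ratio = (1.0 - prior_target) / prior_target
        return prior_ratio * odds * reference.pdf(X_new)

    return adjusted_density


if __name__ == "__main__":
    # Usage: score held-out points; low scores suggest outliers or novelties.
    rng = np.random.default_rng(1)
    target = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
    score = fit_combined_one_class(target)
    print(score(np.array([[0.0, 0.0], [5.0, 5.0]])))
```

In this sketch the classifier only has to learn where the target data deviates from the reference distribution, which is the intuition behind combining the two estimators rather than relying on either one alone.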