Authors: Richard Nock, Frank Nielsen
DOI: 10.1007/11564096_65
Keywords: Support vector machine, One-class classification, Euclidean geometry, Combinatorics, Bregman divergence, Approximation algorithm, Ball (mathematics), Euclidean space, Support point, Mathematics
Abstract: Finding a point which minimizes the maximal distortion with respect to a dataset is an important estimation problem that has recently received growing attention in machine learning, with the advent of one-class classification. We propose two theoretically founded generalizations, to arbitrary Bregman divergences, of a recent and popular smallest enclosing ball approximation algorithm for Euclidean spaces coined by Bădoiu and Clarkson in 2002.
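For reference, the abstract builds on the Euclidean smallest-enclosing-ball scheme of Bădoiu and Clarkson (2002). The sketch below shows that classic Euclidean algorithm only, not the paper's Bregman-divergence generalizations; the function name `badoiu_clarkson_seb` and its parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def badoiu_clarkson_seb(points, n_iter=100):
    """Approximate the smallest enclosing (Euclidean) ball of a point set.

    Classic Badoiu-Clarkson (2002) scheme: start from an arbitrary point,
    then repeatedly step toward the currently farthest point with a
    shrinking step size 1/(t+1). Roughly 1/eps^2 iterations suffice for a
    (1+eps)-approximation of the optimal radius.
    """
    points = np.asarray(points, dtype=float)
    c = points[0].copy()                      # arbitrary initial center
    for t in range(1, n_iter + 1):
        dists = np.linalg.norm(points - c, axis=1)
        far = points[np.argmax(dists)]        # farthest point from current center
        c += (far - c) / (t + 1)              # move center toward it
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius

# Example usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
center, radius = badoiu_clarkson_seb(X, n_iter=1000)
print(center, radius)
```

The paper's contribution replaces the Euclidean distance in this loop with an arbitrary Bregman divergence, which changes how the center update must be carried out; the details are in the full text.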