Authors: John Langford, Alina Beygelzimer
DOI: 10.1007/11503415_11
Keywords:
Abstract: We present a reduction from cost-sensitive classification to binary classification based on (a modification of) error correcting output codes. The reduction satisfies the property that ε regret for binary classification implies l2-regret of at most 2ε for cost estimation. This has several implications: Any regret-minimizing online algorithm for 0/1 loss is (via the reduction) a regret-minimizing online cost-sensitive algorithm. In particular, this means online learning can be made to work for arbitrary (i.e., totally unstructured) loss functions. The output of the reduction can be thresholded so that ε binary regret implies at most $4\sqrt{\epsilon Z}$ cost-sensitive regret, where Z is the expected sum of costs. For multiclass problems, ε binary regret translates into l2-regret of at most 4ε in the estimation of class probabilities. For classification, this implies at most $4\sqrt{\epsilon}$ multiclass regret.
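For context, the reduction builds on the standard error correcting output code (ECOC) scheme, which turns a multiclass problem into several binary problems via a code matrix. The sketch below shows only that basic encode/train/decode loop, not the paper's cost-sensitive modification; the code matrix and the toy nearest-neighbor binary learner are illustrative assumptions, not the authors' construction.

```python
# Minimal sketch of the standard ECOC multiclass-to-binary reduction.
# Each class is assigned a binary codeword (a row of `code`); one binary
# classifier is trained per codeword bit, and prediction decodes by
# choosing the class whose codeword is nearest in Hamming distance.

def ecoc_train(X, y, code, train_binary):
    """Train one binary classifier per column of the code matrix."""
    n_bits = len(code[0])
    classifiers = []
    for j in range(n_bits):
        # Relabel each example by the j-th code bit of its class.
        yj = [code[label][j] for label in y]
        classifiers.append(train_binary(X, yj))
    return classifiers

def ecoc_predict(x, code, classifiers):
    """Predict the class whose codeword is closest to the predicted bits."""
    bits = [clf(x) for clf in classifiers]
    hamming = lambda word: sum(b != w for b, w in zip(bits, word))
    return min(range(len(code)), key=lambda k: hamming(code[k]))

# Toy binary learner (illustrative): 1-nearest-neighbor on a 1-D feature.
def train_binary(X, yb):
    data = list(zip(X, yb))
    return lambda x: min(data, key=lambda p: abs(p[0] - x))[1]

# Usage: four classes on a 1-D feature, a 3-bit code matrix.
code = [[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]]
X, y = [0.0, 1.0, 2.0, 3.0], [0, 1, 2, 3]
clfs = ecoc_train(X, y, code, train_binary)
print(ecoc_predict(0.1, code, clfs))  # -> 0
print(ecoc_predict(2.9, code, clfs))  # -> 3
```

The paper's contribution is a modification of this scheme whose decoded output estimates expected costs, so that small binary regret yields small cost-sensitive regret.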