Authors: Peter L. Bartlett, Olivier Bousquet, Shahar Mendelson
DOI: 10.1214/009053605000000282
Keywords:
Abstract: We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
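To make the central object concrete: a minimal sketch, not the paper's algorithm, of estimating a local empirical Rademacher average by Monte Carlo. It assumes a finite function class (purely for tractability) and hypothetical names (local_empirical_rademacher, losses, radius); the paper itself works with general classes and derives bounds rather than computing this quantity directly.

import numpy as np

def local_empirical_rademacher(fs, X, losses, radius, n_mc=1000, rng=None):
    """Estimate E_sigma sup_{f in F_r} (1/n) sum_i sigma_i f(x_i),
    where F_r = {f in F : empirical loss of f <= radius} is the
    "local" subset of functions with small error.

    fs     : list of callables f(x) -> float (finite class, an assumption)
    X      : sequence of n data points
    losses : empirical loss of each f in fs on the data (localization)
    radius : localization radius; only low-error functions enter the sup
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(X)
    # Localization step: keep only functions with small empirical error.
    local = [f for f, loss in zip(fs, losses) if loss <= radius]
    if not local:
        return 0.0
    # Precompute function values on the sample: shape (|F_r|, n).
    vals = np.array([[f(x) for x in X] for f in local])
    # Monte Carlo over i.i.d. Rademacher signs sigma_i in {-1, +1}.
    total = 0.0
    for _ in range(n_mc):
        sigma = rng.choice([-1.0, 1.0], size=n)
        total += np.max(vals @ sigma) / n  # sup over the local class
    return total / n_mc

Restricting the supremum to the low-error subset is what distinguishes this local quantity from the global Rademacher average; as the abstract notes, it is this data-dependent, localized version that yields optimal rates.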