Authors: Shai Shalev-Shwartz, Nicolò Cesa-Bianchi, Ohad Shamir
DOI:
Keywords: Online machine learning, Gaussian function, Noise, Algorithm, Artificial intelligence, Estimator, Function (mathematics), Polynomial, Pattern recognition, Bounded function, Gradient descent, Computer science
Abstract: We study online learning when individual instances are corrupted by adversarially chosen random noise. We assume the noise distribution is unknown, and may change over time with no restriction other than having zero mean and bounded variance. Our technique relies on a family of unbiased estimators for non-linear functions, which may be of independent interest. We show that a variant of online gradient descent can learn functions in any dot-product (e.g., polynomial) or Gaussian kernel space with any analytic convex loss function. Our variant uses randomized estimates that need to query a random number of noisy copies of each instance, where with high probability this number is upper-bounded by a constant. Allowing such multiple queries cannot be avoided: indeed, we show that learning is in general impossible when only one noisy copy of each instance can be accessed.
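To make the abstract's central idea concrete, here is a minimal sketch (not the paper's exact construction) of an unbiased estimator for a non-linear analytic function of a noisy instance. The function `exp`, the Gaussian noise model, the geometric truncation parameter `q`, and all function names are illustrative assumptions; the paper only requires zero-mean, bounded-variance noise. The trick: draw a random Taylor truncation level N, query N fresh independent noisy copies, and reweight the N-th Taylor term by 1/P(N = n). Independence of the copies gives E[x_1 ... x_n] = mu^n, so the estimate is unbiased, and the geometric tail keeps the query count bounded by a constant with high probability.

```python
# Hedged sketch: unbiased estimation of f(mu) for analytic f, using only
# noisy copies x = mu + noise with unknown zero-mean noise.
import math
import numpy as np

rng = np.random.default_rng(0)

def noisy_copy(mu, sigma=0.3):
    # Oracle returning one noisy view of the true instance mu.
    # Gaussian noise is an assumption for illustration only.
    return mu + rng.normal(0.0, sigma)

def unbiased_exp_estimate(mu, q=0.5):
    # Estimate exp(mu) = sum_n mu^n / n! without ever seeing mu directly.
    # Truncation level N has P(N = n) = (1 - q) * q**n over {0, 1, 2, ...}.
    n = rng.geometric(1.0 - q) - 1
    p_n = (1.0 - q) * q**n
    a_n = 1.0 / math.factorial(n)          # n-th Taylor coefficient of exp
    prod = 1.0
    for _ in range(n):                     # n independent noisy queries
        prod *= noisy_copy(mu)             # E[product] = mu**n
    return (a_n / p_n) * prod              # importance-weighted Taylor term

mu = 0.5
estimates = [unbiased_exp_estimate(mu) for _ in range(200_000)]
print(abs(np.mean(estimates) - math.exp(mu)))  # small: estimator is unbiased
```

Averaging many such estimates recovers exp(mu) despite every individual query being corrupted; in the paper's setting, estimates of this flavor feed the gradient steps of online gradient descent in kernel space.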