Authors: Olivier Bousquet, Manfred K. Warmuth
Keywords: Binary logarithm, Partition (database), Weight, Concept drift, Open problem, Sequence, Small set, Computer science, Algorithm, Encoding (memory)
Abstract: In this paper, we examine on-line learning problems in which the target concept is allowed to change over time. In each trial a master algorithm receives predictions from a large set of n experts. Its goal is to predict almost as well as the best sequence of such experts chosen off-line by partitioning the training sequence into k+1 sections and then choosing the best expert for each section. We build on methods developed by Herbster and Warmuth and consider an open problem posed by Freund where the experts in the best partition are from a small pool of size m. Since k >> m, the best expert shifts back and forth between the experts of this small pool. We propose algorithms that solve this open problem by mixing the past posteriors maintained by the master algorithm. We relate the number of bits needed for encoding the best partition to the loss bounds of the algorithms. Instead of paying log n for choosing the best expert in each section, we first pay log (n choose m) bits in the bounds for identifying the pool of m experts and then log m bits per new section. In the bounds we also pay twice for encoding the boundaries of the sections.
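The abstract describes the master algorithm only at a high level. The sketch below is a minimal illustration of the "mixing past posteriors" idea it refers to: a multiplicative loss update followed by mixing the new posterior with earlier posteriors, so that weight can flow back to experts from the small pool that performed well before. The square loss, the uniform-past mixing scheme, the function name, and the parameter values are illustrative assumptions, not the paper's exact algorithm or bounds.

```python
import numpy as np

def mixing_past_posteriors(expert_preds, outcomes, eta=1.0, alpha=0.1):
    """Sketch of a master algorithm that mixes past posteriors.

    expert_preds: (T, n) array; expert_preds[t, i] is expert i's prediction at trial t.
    outcomes:     (T,) array of observed outcomes y_t in [0, 1].
    eta:          learning rate of the exponential (loss) update.
    alpha:        fraction of weight mixed back into earlier posteriors each trial.
    """
    T, n = expert_preds.shape
    v = np.full(n, 1.0 / n)      # current posterior over the n experts
    past = [v.copy()]            # stored posteriors (initial vector plus later ones)
    master_preds = np.empty(T)

    for t in range(T):
        # Predict with the posterior-weighted average of the experts' predictions.
        master_preds[t] = v @ expert_preds[t]

        # Loss update: penalize each expert exponentially in its (square) loss.
        losses = (expert_preds[t] - outcomes[t]) ** 2
        vm = v * np.exp(-eta * losses)
        vm /= vm.sum()

        # Mixing update (uniform-past scheme, one of several possibilities):
        # keep (1 - alpha) of the new posterior and spread alpha uniformly
        # over all earlier stored posteriors.
        v = (1 - alpha) * vm + alpha * np.mean(past, axis=0)
        past.append(vm.copy())

    return master_preds
```

With alpha mixing only toward the initial uniform vector instead of all past posteriors, this sketch would reduce to a Fixed-Share-style update; mixing toward earlier posteriors is what lets the algorithm track an expert pool of size m that is revisited many times.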