Authors: Wouter M. Koolen, Tim van Erven
DOI:
Keywords: Computer science, Set (abstract data type), Structure (mathematical logic), Tracking (particle physics), Scheme (programming language), Interpretation (logic), Hidden Markov model, Artificial intelligence, Mixing (mathematics)
Abstract: A problem posed by Freund is how to efficiently track a small pool of experts out of a much larger set. This problem was solved when Bousquet and Warmuth introduced their mixing past posteriors (MPP) algorithm in 2001. In Freund's problem the experts would normally be considered black boxes. However, in this paper we re-examine the case where the experts have internal structure that enables them to learn. This case has two possible interpretations: should the experts learn from all data, or only from the subsequence on which they are being tracked? The MPP algorithm solves the first case. We generalise it to address the second option. Our results apply to any expert structure that can be formalised using (expert) hidden Markov models. Curiously enough, for our interpretation there are two natural reference schemes: freezing and sleeping. For each scheme, we provide an efficient prediction strategy and prove the relevant loss bound.
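To make the tracking setup concrete, here is a minimal sketch of an MPP-style update in the black-box setting the abstract starts from. The learning rate `eta`, mixing weight `alpha`, and the simple "uniform over all past posteriors" mixing scheme are illustrative assumptions, not the exact parameterisation of the paper; MPP admits several mixing schemes, of which this is one of the simplest.

```python
import numpy as np

def mpp(losses, eta=1.0, alpha=0.05):
    """Illustrative sketch of a mixing-past-posteriors update.

    losses: (T, n) array of per-trial expert losses in [0, 1].
    Returns the algorithm's mixture (dot) loss at each trial.
    """
    T, n = losses.shape
    w = np.full(n, 1.0 / n)   # current weights over experts
    past = [w.copy()]         # stored past posteriors; v_0 is the uniform prior
    alg_loss = np.empty(T)
    for t in range(T):
        alg_loss[t] = w @ losses[t]           # predict with the current mixture
        v = w * np.exp(-eta * losses[t])      # exponential-weights loss update
        v /= v.sum()
        past.append(v)
        # Mix the fresh posterior with a uniform average of all stored
        # past posteriors; this lets weight flow back to experts that
        # were good earlier, enabling tracking of a small pool.
        w = (1 - alpha) * v + alpha * np.mean(past, axis=0)
    return alg_loss
```

Mixing back in past posteriors is what distinguishes this from plain exponential weights: an expert whose weight has decayed can recover quickly when the active pool switches back to it.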