Authors: Anusha Lalitha, Tara Javidi
DOI: 10.1109/ALLERTON.2015.7446979
Keywords:
Abstract: This paper considers a problem of distributed hypothesis testing and cooperative learning. Individual nodes in a network receive noisy local (private) observations whose distribution is parameterized by a discrete parameter (hypothesis). The conditional distributions are known locally at the nodes, but the true parameter/hypothesis is not known. We consider a social ("non-Bayesian") learning rule from previous literature, in which nodes first perform a Bayesian update of their belief (distribution estimate) based on the local observation, communicate these updates to their neighbors, and then perform a "non-Bayesian" linear consensus using the log-beliefs of their neighbors. For this rule, it is known that under mild assumptions, the belief of any node on an incorrect hypothesis converges to zero exponentially fast, with an exponential rate characterized by the network structure and the divergences between the observations' distributions. Tight bounds on the probability of deviating from this nominal rate are derived for aperiodic networks, and are shown to hold for all distributions that satisfy a bounded moment condition.
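The two-step rule summarized in the abstract can be illustrated concretely. Below is a minimal Python sketch, not the authors' exact formulation: the function name social_learning_step, the uniform weight matrix W, and the toy Bernoulli likelihoods are illustrative assumptions; only the structure (a local Bayesian update followed by linear consensus on log-beliefs) comes from the abstract.

```python
import numpy as np

def social_learning_step(beliefs, likelihoods, observations, W):
    """One round of the two-step rule sketched in the abstract:
    (1) local Bayesian update, (2) linear consensus on log-beliefs.

    beliefs:      (n_nodes, n_hyp)        current belief of each node
    likelihoods:  (n_nodes, n_hyp, n_obs) P(x | theta) at each node
    observations: (n_nodes,)              each node's private observation
    W:            (n_nodes, n_nodes)      row-stochastic weight matrix
    """
    n_nodes = beliefs.shape[0]
    # (1) Bayesian update with each node's private observation.
    lik = likelihoods[np.arange(n_nodes), :, observations]
    bayes = beliefs * lik
    bayes /= bayes.sum(axis=1, keepdims=True)
    # (2) "Non-Bayesian" consensus: each node takes a weighted average of
    # its neighbors' log-beliefs (a geometric average of beliefs) and
    # renormalizes to get a probability vector again.
    log_b = W @ np.log(bayes)
    new = np.exp(log_b - log_b.max(axis=1, keepdims=True))
    return new / new.sum(axis=1, keepdims=True)

# Toy run (hypothetical setup): 3 fully connected nodes, 2 hypotheses,
# Bernoulli observations; hypothesis 0 is the ground truth.
rng = np.random.default_rng(0)
p = np.array([[0.7, 0.3], [0.6, 0.4], [0.55, 0.45]])  # P(x=1 | theta) per node
likelihoods = np.stack([1.0 - p, p], axis=2)          # shape (3, 2, 2)
W = np.full((3, 3), 1.0 / 3.0)
beliefs = np.full((3, 2), 0.5)
for _ in range(200):
    obs = (rng.random(3) < p[:, 0]).astype(int)  # draw from the true hypothesis
    beliefs = social_learning_step(beliefs, likelihoods, obs, W)
print(beliefs)  # every node's belief concentrates on hypothesis 0
```

In this sketch, the geometric averaging of beliefs is what drives the belief on the incorrect hypothesis to zero exponentially fast, consistent with the rate described in the abstract that mixes each node's local divergences through the network weights.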