Authors: Jayakrishnan Unnikrishnan, Dayu Huang, Sean Meyn, Venu Veeravalli, Amit Surana
DOI:
Keywords:
Abstract: The optimal solution to the universal hypothesis testing problem suffers from high variance when the underlying distributions have large alphabets. We propose a new approach that addresses this issue. Our solution is based on the mismatched divergence, a new lower bound on the Kullback-Leibler divergence (i.e., relative entropy). We present results on the asymptotic statistics of the resulting test statistic and on the geometry of the mismatched test.
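The abstract does not spell out how the mismatched divergence is constructed. A common way to obtain a lower bound of this kind is to restrict the variational representation of KL divergence, D(mu||pi) = sup_f { mu(f) - log pi(e^f) }, to a finite-dimensional class of functions. The sketch below illustrates that construction on a finite alphabet; the basis functions, the uniform null model, and the optimizer are illustrative assumptions and may differ in detail from the authors' definition.

```python
import numpy as np
from scipy.optimize import minimize


def kl_divergence(mu, pi):
    """Exact KL divergence D(mu || pi) for finite-alphabet distributions."""
    mu, pi = np.asarray(mu, float), np.asarray(pi, float)
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))


def mismatched_divergence(mu, pi, psi):
    """Lower bound on D(mu || pi) obtained by restricting the variational form
    sup_f { mu(f) - log pi(e^f) } to the span of the rows of psi
    (psi has shape d x alphabet_size)."""
    mu, pi, psi = np.asarray(mu, float), np.asarray(pi, float), np.asarray(psi, float)

    def neg_objective(r):
        f = r @ psi                      # f_r = sum_i r_i * psi_i
        return -(mu @ f - np.log(pi @ np.exp(f)))

    # At r = 0 the objective is 0, so the bound is always nonnegative
    # and never exceeds the unrestricted supremum, i.e., D(mu || pi).
    res = minimize(neg_objective, np.zeros(psi.shape[0]), method="BFGS")
    return float(-res.fun)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 20                               # alphabet size (illustrative)
    pi = np.full(n, 1.0 / n)             # hypothetical null model: uniform
    mu = rng.dirichlet(np.ones(n))       # hypothetical observed distribution
    # Illustrative two-dimensional basis of functions on the alphabet.
    psi = np.vstack([np.linspace(-1, 1, n), np.linspace(-1, 1, n) ** 2])

    print("KL divergence        :", kl_divergence(mu, pi))
    print("Mismatched divergence:", mismatched_divergence(mu, pi, psi))
```

In this sketch the mismatched value is never larger than the exact KL divergence, since the maximization runs over a strict subset of functions; choosing a richer basis tightens the bound at the cost of higher variance in the resulting test statistic.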