Authors: Ilias Diakonikolas, Daniel M. Kane, Alistair Stewart
DOI:
Keywords:
Abstract: An $(n, k)$-Poisson Multinomial Distribution (PMD) is a random variable of the form $X = \sum_{i=1}^n X_i$, where the $X_i$'s are independent random vectors supported on the set of standard basis vectors in $\mathbb{R}^k.$ In this paper, we obtain a refined structural understanding of PMDs by analyzing their Fourier transform. As our core structural result, we prove that the Fourier transform of PMDs is {\em approximately sparse}, i.e., roughly speaking, its $L_1$-norm is small outside a small set. By building on this result, we obtain the following applications:

{\bf Learning Theory.} We design the first computationally efficient learning algorithm for PMDs with respect to the total variation distance. Our algorithm learns an arbitrary $(n, k)$-PMD within variation distance $\epsilon$ using a near-optimal sample size of $\widetilde{O}_k(1/\epsilon^2),$ and runs in time $\widetilde{O}_k(1/\epsilon^2) \cdot \log n.$ Previously, no algorithm with $\mathrm{poly}(1/\epsilon)$ runtime was known, even for $k=3.$

{\bf Game Theory.} We give the first efficient polynomial-time approximation scheme (EPTAS) for computing Nash equilibria in anonymous games. For normalized anonymous games with $n$ players and $k$ strategies, our algorithm computes a well-supported $\epsilon$-Nash equilibrium in time $n^{O(k^3)} \cdot (k/\epsilon)^{O(k^3\log(k/\epsilon)/\log\log(k/\epsilon))^{k-1}}.$ The best previous algorithm for this problem had running time $n^{(f(k)/\epsilon)^k},$ where $f(k) = \Omega(k^{k^2})$, for any $k>2.$

{\bf Statistics.} We prove a multivariate central limit theorem (CLT) that relates an arbitrary PMD to a discretized Gaussian with the same mean and covariance. Our new CLT strengthens the CLT of Valiant and Valiant by completely removing the dependence on $n$ in the error bound.
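To make the definition concrete, the following minimal Python sketch (not from the paper; the function name `sample_pmd` and the use of NumPy are illustrative choices) draws a sample from an $(n, k)$-PMD exactly as defined above: each $X_i$ is an independent draw of one standard basis vector of $\mathbb{R}^k$, and $X$ is their sum.

```python
import numpy as np

def sample_pmd(probabilities, rng=None):
    """Draw one sample from an (n, k)-PMD.

    `probabilities` is an (n, k) array whose i-th row gives the
    distribution of X_i over the k standard basis vectors e_1, ..., e_k.
    The returned sample is X = sum_i X_i, a vector of counts in Z^k.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, k = probabilities.shape
    counts = np.zeros(k, dtype=int)
    for p in probabilities:
        # X_i equals the basis vector e_j with probability p[j]
        j = rng.choice(k, p=p)
        counts[j] += 1
    return counts

# Example: n = 100 independent vectors over k = 3 basis vectors,
# each X_i with its own (here randomly chosen) distribution.
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(3), size=100)
print(sample_pmd(P, rng))
```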