Authors: Mokshay Madiman, Sergey G. Bobkov
DOI: 10.1214/10-AOP592
Keywords: Entropy (information theory), Combinatorics, Mathematics, Asymptotic equipartition property
Abstract: … on which the distribution of X is supported. In this case, the information content of X is essentially the number of bits needed to represent X by a coding scheme that minimizes average code length ([Sha48]). In the continuous case (with reference measure dx), one may still call it the information content, even though this interpretation no longer holds. In statistics, one may think of it as the negative of the log-likelihood function. Its average value is known more commonly as the entropy. Indeed, the entropy is defined by $h(X) = -\int f(x)\log f(x)\,dx = -\mathbf{E}\log f(X)$.
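As a concrete check of this formula (a worked example, not taken from the paper), take the standard exponential density $f(x) = e^{-x}$ on $(0,\infty)$:
$$h(X) = -\int_0^\infty e^{-x}\log\bigl(e^{-x}\bigr)\,dx = \int_0^\infty x\,e^{-x}\,dx = 1,$$
so the entropy equals 1 nat. Here the information content $-\log f(X) = X$ fluctuates around this average value, which is the kind of concentration captured by the asymptotic equipartition property listed among the keywords.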