Authors: J. S. Dehesa, P. Sánchez-Moreno, A. Zarzo
DOI: 10.1088/1751-8113/45/12/125305
Keywords: Rényi entropy, Kullback–Leibler divergence, Probability distribution, Total correlation, Mathematics, Jensen–Shannon divergence, Fisher information, Differential entropy, Mathematical analysis, Divergence (statistics)
Abstract: The measure of Jensen–Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. This quantity, in contrast to the remaining Jensen divergences, is very sensitive to fluctuations because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. It is therefore appropriate and informative when studying the similarity of distributions, mainly for those having an oscillatory character. The new divergence shares with the Jensen–Shannon divergence the following properties: non-negativity, additivity when applied to an arbitrary number of probability densities, symmetry under the exchange of these densities, vanishing if and only if all the densities are equal, and definiteness even when the densities present non-common zeros. Moreover, it is shown to be expressible in terms of the relative Fisher information, just as the Jensen–Shannon divergence is expressible in terms of the Kullback–Leibler divergence or relative Shannon entropy. Finally, the two divergences are compared for three large, non-trivial and qualitatively different families of probability distributions: the sinusoidal, generalized gamma-like and Rakhmanov–Hermite distributions.
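The abstract does not spell out the defining formula. A plausible form, following the Jensen–Shannon construction with the Fisher information in place of the Shannon entropy, is the sketch below; the weights $\omega_i$ and densities $\rho_i$ are generic symbols introduced here for illustration, not notation taken from the paper:

\[
\mathrm{JFD}(\rho_1,\dots,\rho_n) \;=\; \sum_{i=1}^{n} \omega_i\, I(\rho_i) \;-\; I\!\Bigl(\sum_{i=1}^{n} \omega_i\, \rho_i\Bigr),
\qquad
I(\rho) \;=\; \int \frac{[\rho'(x)]^2}{\rho(x)}\,\mathrm{d}x,
\]

with $\omega_i \ge 0$ and $\sum_i \omega_i = 1$. Note the sign relative to the Jensen–Shannon divergence $\mathrm{JSD} = H(\sum_i \omega_i \rho_i) - \sum_i \omega_i H(\rho_i)$: since the Fisher information $I$ is convex (whereas the Shannon entropy $H$ is concave), the mixture term is subtracted rather than leading, which would yield the non-negativity and the vanishing-iff-equal property listed in the abstract.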