Authors: Maria Giraudo, Laura Sacerdote, Roberta Sirovich
DOI: 10.3390/E15125154
Keywords: Statistics, Applied mathematics, Mathematics, Bias of an estimator, Consistent estimator, Efficient estimator, Stein's unbiased risk estimate, Invariant estimator, Estimator, Minimum-variance unbiased estimator, Mutual information
Abstract: A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First of all, an equation that links the mutual information to the entropy of a suitable random vector with uniformly distributed components is deduced. When d = 2, this equation reduces to the well known connection between mutual information and the entropy of the copula function associated to the original random variables. Hence, the problem of estimating the mutual information of the original random vector is reduced to the estimation of the entropy of a random vector obtained through a multidimensional transformation. The estimator we propose is a two-step method: first estimate the transformation and obtain the transformed sample, then estimate its entropy. The properties of the new estimator are discussed through simulation examples, and its performances are compared to those of the best estimators in the literature. The precision of the estimator converges to values of the same order of magnitude of the best estimators tested. However, the new estimator is unbiased even for larger dimensions and smaller sample sizes, while the other tested estimators show a bias in these cases.
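To illustrate the two-step idea described in the abstract, below is a minimal sketch restricted to d = 2, where the transformation reduces to the marginal empirical CDFs (the empirical copula transform) and the entropy of the transformed sample is estimated with a standard Kozachenko-Leonenko k-nearest-neighbour estimator. This is an assumption-laden toy version, not the authors' multidimensional linkage construction or their exact estimator; the function names (`kl_entropy`, `mutual_information_copula`) and the choice k = 5 are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def kl_entropy(sample, k=5):
    """Kozachenko-Leonenko kNN differential entropy estimate (nats), max norm."""
    n, d = sample.shape
    tree = cKDTree(sample)
    # distance to the k-th nearest neighbour, excluding the point itself
    eps = tree.query(sample, k=k + 1, p=np.inf)[0][:, -1]
    # log-volume of the unit ball in the max norm is d * log(2)
    return digamma(n) - digamma(k) + d * np.log(2) + d * np.mean(np.log(eps))


def mutual_information_copula(x, y, k=5):
    """Two-step MI estimate for d = 2: copula (rank) transform, then entropy."""
    n = len(x)
    # step 1: empirical-CDF transform -> approximately uniform margins
    u = np.argsort(np.argsort(x)) + 1.0
    v = np.argsort(np.argsort(y)) + 1.0
    c = np.column_stack([u, v]) / (n + 1.0)
    # step 2: mutual information equals minus the entropy of the copula sample
    return -kl_entropy(c, k=k)


# quick check on a correlated Gaussian pair: true MI = -0.5 * log(1 - rho**2)
rng = np.random.default_rng(0)
rho = 0.8
xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000)
print(mutual_information_copula(xy[:, 0], xy[:, 1]))
print(-0.5 * np.log(1 - rho ** 2))
```

For d > 2 the paper replaces the marginal-CDF transform with a multidimensional transformation whose image has uniformly distributed components, so the rank step above would have to be replaced accordingly; the entropy-estimation step stays conceptually the same.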