Authors: A. Ben Hamza, Hamid Krim
DOI: 10.1007/978-3-540-45063-4_10
Keywords:
Abstract: Information-theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence, which is defined for an arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, as well as performance upper bounds in terms of the Bayes risk and the asymptotic error of the nearest neighbor classifier. To gain further insight into the robustness and applicability of this measure in imaging, we provide substantial numerical experiments showing the power of this entropic measure in image registration and segmentation.
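The abstract does not spell the measure out, but the Jensen-Rényi divergence is commonly written as the Rényi entropy of a weighted mixture minus the weighted sum of the individual Rényi entropies, JR_α(p_1, …, p_n) = R_α(Σ ω_i p_i) − Σ ω_i R_α(p_i). The following is a minimal sketch under that assumed definition, with illustrative function names; it is not the paper's implementation. It takes α ∈ (0, 1), where the Rényi entropy is concave and the divergence is therefore nonnegative.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy R_alpha(p) = log(sum_k p_k^alpha) / (1 - alpha), alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability bins contribute nothing for alpha > 0
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi(distributions, weights, alpha):
    """Jensen-Renyi divergence of several distributions: Renyi entropy of the
    weighted mixture minus the weighted sum of the individual Renyi entropies."""
    distributions = [np.asarray(p, dtype=float) for p in distributions]
    weights = np.asarray(weights, dtype=float)
    mixture = sum(w * p for w, p in zip(weights, distributions))
    return renyi_entropy(mixture, alpha) - sum(
        w * renyi_entropy(p, alpha) for w, p in zip(weights, distributions)
    )

# Example: divergence between two discrete distributions with equal weights.
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.5, 0.4, 0.1])
print(jensen_renyi([p, q], weights=[0.5, 0.5], alpha=0.5))  # >= 0, and 0 iff p == q
```

In an image-registration setting the inputs would typically be normalized intensity histograms of the two images, so the divergence is maximized or minimized over candidate alignments; that usage is an assumption consistent with the abstract, not a detail stated in it.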