Authors: Jose C. Principe, Dongxin Xu
DOI:
Keywords:
Abstract: The major goal of this research is to develop general nonparametric methods for the estimation of entropy and mutual information, giving a unifying point of view for their use in signal processing and neural computation. In many real-world problems, information is carried solely by data samples, without any other a priori knowledge. The central issue of "learning from examples" is to estimate energy, entropy, or mutual information of a variable only from its samples, and to adapt the system parameters by optimizing a criterion based on the estimation. By using alternative measures of entropy, such as Renyi's quadratic entropy, coupled with Parzen window estimation of the probability density function from samples, we developed an "information potential" method. In this method, data samples are treated as physical particles, and the entropy turns out to be related to the "potential energy" of these particles. Entropy maximization or minimization is then equivalent to minimization or maximization of the "information potential." Based on the Cauchy-Schwartz inequality and a Euclidean distance metric, we further proposed alternative measures of mutual information that avoid estimating Shannon's information. There is also a "cross information potential" implementation that measures the correlation between "marginal information potentials" at several levels. Learning in the output mapper is implemented by propagating the "information force" back through the parameters. Since the criteria are decoupled from the structure of the learning machines, they lead to general adaptation schemes. They provide a microscopic expression, at the sample level, of a macroscopic measure. The algorithms examine the relative position of each pair of samples and thus have computational complexity O(N2). An on-line local algorithm is also discussed, where the force field relates to the famous biological Hebbian and anti-Hebbian rules. For a better understanding, a generalized eigendecomposition view is also proposed. The methods have been successfully applied to various problems: aspect angle estimation from synthetic aperture radar (SAR) imagery, target recognition in SAR, layer-by-layer training of multilayer networks, and blind source separation. The good performance confirms the validity and efficiency of the methods.
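To make the core idea concrete, the following is a minimal sketch (not the authors' code) of the "information potential" estimator the abstract describes: with a Gaussian Parzen window of width sigma (a free parameter chosen here for illustration), Renyi's quadratic entropy of a sample set reduces to the negative log of a pairwise sum of kernel evaluations, which is the O(N2) interaction the abstract mentions.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the information potential
    V = (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2),
    where G is a Gaussian kernel. sigma is the Parzen window
    width (an assumed free parameter, not fixed by the paper's abstract)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    # All pairwise differences: this is the O(N^2) "particle interaction"
    # structure noted in the abstract.
    diff = x[:, None, :] - x[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    var = 2.0 * sigma ** 2  # convolved kernel variance
    gauss = np.exp(-sq_dist / (2.0 * var)) / ((2.0 * np.pi * var) ** (d / 2))
    return gauss.mean()

def renyi_quadratic_entropy(x, sigma=1.0):
    # H2(x) = -log V(x): maximizing entropy is equivalent to
    # minimizing the information potential, and vice versa.
    return -np.log(information_potential(x, sigma))
```

Because the gradient of V with respect to each sample is a sum of pairwise kernel derivatives, it can be read as a "force" on each particle, which is what gets backpropagated through the mapper's parameters in the adaptation scheme.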