Abstract: Measures of divergence are used in many engineering problems such as statistics, mathematical programming, computational vision, and neural networks. The Kullback-Leibler divergence is a typical example: it is defined between two probability distributions and is invariant under information transformations. The Bregman divergence is another type of divergence, often used in optimization and signal processing; it belongs to a class of divergences having a dually flat geometrical structure. A divergence is used for minimizing the discrepancy between observed evidence and an underlying model, where projection of the data onto the model subspace plays a fundamental role. Here, geometry, in particular the geodesic structure, is useful, because the generalized Pythagorean theorem and the projection theorem hold.
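For concreteness, a minimal sketch of the two divergences named above, using their standard definitions (the notation is an assumption, not taken from the paper): for discrete probability distributions $p$ and $q$, and for a strictly convex, differentiable function $\phi$,
\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i} p_i \log \frac{p_i}{q_i},
\qquad
D_{\phi}(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle .
\]
Choosing $\phi$ as the negative Shannon entropy on the probability simplex recovers the KL divergence as a special case of the Bregman divergence, which is one reason both fit the dually flat framework mentioned in the abstract.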