Author: Aleks Jakulin
DOI:
Keywords: Discretization, Machine learning, Applied mathematics, Artificial intelligence, Probabilistic logic, Statistical model, Finite mixture, Data structure, Entropy (information theory), Mathematics, Correlation, Continuous data
Abstract: Entropy and information are common measures of probabilistic models of data, frequently used for discrete or discretized data but more rarely for continuous data. We employ finite mixture models, which handle both kinds of data simultaneously, to construct a model whose entropy can be estimated. Analytic estimates are intractable in some cases, so an approximate sample estimate is used. A simplified formulation of the bootstrap is employed to assess the distribution of the entropy, which is then represented with confidence intervals. We show how this serves as a unified approach to quantifying three fundamental qualitative aspects of models: the correlation aspect, which corresponds to linearities in the model; the structure aspect, which helps capture the model's nonlinearities; and the interaction aspect, which reflects underlying entanglements between attributes resulting from either or both.
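Note: The abstract describes a sample-based (Monte Carlo) entropy estimate for a fitted finite mixture model, with a simplified bootstrap used to express the uncertainty of that estimate as confidence intervals. The sketch below is a minimal illustration of that general idea under stated assumptions, not the paper's exact procedure: the Gaussian mixture, the scikit-learn API, the toy data, the sample sizes, and the bootstrap-over-log-densities formulation are all choices made for the example.

```python
# Illustrative sketch only: Monte Carlo entropy estimate for a fitted
# Gaussian mixture, plus a simple bootstrap confidence interval.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy continuous data from two clusters (stands in for real data).
data = np.concatenate([rng.normal(-2.0, 0.5, size=(500, 1)),
                       rng.normal(+1.0, 1.0, size=(500, 1))])

# Fit a finite mixture model whose entropy we want to estimate.
gmm = GaussianMixture(n_components=2, random_state=0).fit(data)

# Monte Carlo entropy estimate: H(p) ~= -E[log p(X)], X ~ fitted model.
samples, _ = gmm.sample(10_000)
log_p = gmm.score_samples(samples)          # log densities under the model
entropy_hat = -log_p.mean()

# Simplified bootstrap over the per-sample log densities, giving a
# distribution of the entropy estimate summarized by a confidence interval.
boot = np.array([-rng.choice(log_p, size=log_p.size, replace=True).mean()
                 for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"entropy ~ {entropy_hat:.3f} nats, 95% CI [{lo:.3f}, {hi:.3f}]")
```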