Inference and Learning in a Latent Variable Model for Beta Distributed Interval Data.

Authors: Jörg Lücke, S. Hamid Mousavi, Jakob Drefs, Enrico Guiraud, Mareike Buhl

DOI: 10.3390/E23050552

Abstract: Latent Variable Models (LVMs) are well-established tools to accomplish a range of different data processing tasks. Applications exploit the ability of LVMs to identify latent data structure in order to improve data (e.g., through denoising) or to estimate the relation between latent causes and measurements in medical data. In the latter case, LVMs in the form of noisy-OR Bayes nets represent the standard approach to relate binary latents (which represent diseases) to binary observables (which represent symptoms). Bayes nets with a binary representation of symptoms may be perceived as a coarse approximation, however. In practice, real disease symptoms can range from absent over mild and intermediate to very severe. Therefore, using diseases/symptoms relations as a motivation, we here ask how standard noisy-OR Bayes nets can be generalized to incorporate continuous observables, e.g., variables that model symptom severity in an interval from healthy to pathological. This transition poses a number of challenges, including a transition from Bernoulli to Beta distributions to model symptom statistics. While noisy-OR-like approaches are constrained to model how causes determine the observables' mean values, the use of Beta distributions additionally provides (and also requires) the means to model the observables' variances. To meet the challenges emerging when generalizing to Beta distributed observables, we investigate a novel LVM that uses a maximum non-linearity to model how the latents determine the means and variances of the observables. Given the goal of likelihood maximization, we then leverage recent theoretical results to derive an Expectation Maximization (EM) algorithm for the suggested LVM. We further show how variational EM can be used to efficiently scale the approach to large networks. Experimental results finally illustrate the efficacy of the proposed model on both synthetic and real data sets. Importantly, the model produces reliable results in estimating causes in proofs of concept and first tests based on images.
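The generative idea sketched in the abstract can be illustrated with a small NumPy simulation. Everything below is a hedged assumption for illustration only, not the paper's exact formulation: the weight matrices `W_mu`/`W_nu`, the floor parameters `mu0`/`nu0`, and the mean/precision parameterization of the Beta distribution are all hypothetical. The sketch shows the key structural points of the abstract: binary latents (diseases), observables in the unit interval (symptom severities), a max non-linearity combining latent influences, and the fact that the latents set both the mean and the variance of each Beta-distributed observable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: H binary latents ("diseases"), D observables ("symptom severities").
H, D = 4, 6
W_mu = rng.uniform(0.2, 0.9, size=(H, D))   # illustrative per-latent target means in (0, 1)
W_nu = rng.uniform(5.0, 50.0, size=(H, D))  # illustrative per-latent precisions (mean/precision Beta form)

def sample_observables(s, mu0=0.05, nu0=20.0):
    """Combine the influences of active latents with a max non-linearity
    to set the mean and precision of each Beta observable, then sample.
    mu0/nu0 play the role of a 'healthy' baseline when no latent is active."""
    eff = np.where(s[:, None].astype(bool), W_mu, 0.0)   # zero out inactive latents, shape (H, D)
    mu = np.maximum(eff.max(axis=0), mu0)                # max non-linearity on the means
    winner = eff.argmax(axis=0)                          # latent that dominates each observable
    nu = np.where(eff.max(axis=0) > mu0,                 # precision taken from the dominating latent
                  W_nu[winner, np.arange(D)], nu0)
    # Mean/precision -> standard (alpha, beta) Beta parameters; variance follows from nu.
    alpha, beta = mu * nu, (1.0 - mu) * nu
    return rng.beta(alpha, beta)

# Example: latents 0 and 3 active; all sampled severities lie in (0, 1).
x = sample_observables(np.array([1, 0, 0, 1]))
```

Note how, unlike a noisy-OR net, the active latents here determine not only where each observable concentrates (via `mu`) but also how sharply it concentrates (via `nu`), which is exactly the extra degree of freedom the abstract attributes to Beta-distributed observables.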

References (69)
M. A. Shwe, D. E. Heckerman, M. Henrion, E. J. Horvitz, H. P. Lehmann, G. F. Cooper, B. Middleton, "Probabilistic diagnosis using a reformulation of the INTERNIST-1/QMR knowledge base. I. The probabilistic model and inference algorithms." Methods of Information in Medicine, vol. 30, pp. 241–255 (1991). doi:10.1055/s-0038-1634846
Dacheng Tao, Jun Li, "Simple Exponential Family PCA." International Conference on Artificial Intelligence and Statistics, pp. 453–460 (2010).
Alex Teichman, Andrew Y. Ng, Honglak Lee, Rajat Raina, "Exponential family sparse coding with applications to self-taught learning." International Joint Conference on Artificial Intelligence, pp. 1113–1119 (2009).
Jacquelyn A. Shelton, Jan Gasthaus, Zhenwen Dai, Jörg Lücke, Arthur Gretton, "GP-Select: Accelerating EM Using Adaptive Subspace Preselection." Neural Computation, vol. 29, pp. 2177–2202 (2017). doi:10.1162/NECO_a_00982
L. K. Saul, T. Jaakkola, M. I. Jordan, "Mean field theory for sigmoid belief networks." Journal of Artificial Intelligence Research, vol. 4, pp. 61–76 (1996). doi:10.1613/jair.251
Miloš Hauskrecht, Tomáš Šingliar, "Noisy-OR Component Analysis and its Application to Link Analysis." Journal of Machine Learning Research, vol. 7, pp. 2189–2213 (2006).
Serafín Moral, Rafael Rumí, Antonio Salmerón, "Mixtures of Truncated Exponentials in Hybrid Bayesian Networks." European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty, pp. 156–167 (2001). doi:10.1007/3-540-44652-4_15
Jacquelyn A. Shelton, Abdul-Saboor Sheikh, Jörg Bornschein, Philip Sterne, Jörg Lücke, "Nonlinear Spike-And-Slab Sparse Coding for Interpretable Image Encoding." PLOS ONE, vol. 10, e0124088 (2015). doi:10.1371/journal.pone.0124088
Yunjin Chen, Thomas Pock, "Trainable Nonlinear Reaction Diffusion: A Flexible Framework for Fast and Effective Image Restoration." IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, pp. 1256–1272 (2017). doi:10.1109/TPAMI.2016.2596743
Claes Enøe, Marios P. Georgiadis, Wesley O. Johnson, "Estimation of sensitivity and specificity of diagnostic tests and disease prevalence when the true disease state is unknown." Preventive Veterinary Medicine, vol. 45, pp. 61–81 (2000). doi:10.1016/S0167-5877(00)00117-3