Marginal Likelihoods for Distributed Parameter Estimation of Gaussian Graphical Models

Authors: Zhaoshi Meng, Dennis Wei, Ami Wiesel, Alfred O. Hero

DOI: 10.1109/TSP.2014.2350956

Keywords: Gaussian; Mathematical optimization; Graphical model; Estimator; Rate of convergence; Maximum likelihood sequence estimation; Covariance matrix; Estimation of covariance matrices; Estimation theory; Applied mathematics; Mathematics

Abstract: We consider distributed estimation of the inverse covariance matrix, also called the concentration or precision matrix, in Gaussian graphical models. Traditional centralized estimation often requires global inference of the covariance matrix, which can be computationally intensive in large dimensions. Approximate inference based on message-passing algorithms, on the other hand, can lead to unstable and biased estimation in loopy graphical models. Here, we propose a general framework for distributed estimation based on a maximum marginal likelihood (MML) approach. This approach computes local parameter estimates by maximizing marginal likelihoods defined with respect to data collected from local neighborhoods. Due to the non-convexity of the MML problem, we introduce and solve a convex relaxation. The local estimates are then combined into a global estimate without the need for iterative message-passing between neighborhoods. The proposed algorithm is naturally parallelizable and computationally efficient, thereby making it suitable for high-dimensional problems. In the classical regime, where the number of variables is fixed and the number of samples increases to infinity, the proposed estimator is shown to be asymptotically consistent and to improve monotonically as the neighborhood size increases. In the scaling regime, where both the number of variables and the number of samples increase, the convergence rate to the true parameters is derived and is seen to be comparable to that of maximum-likelihood estimation. Extensive numerical experiments demonstrate the improved performance of the two-hop version of the proposed estimator, which suffices to almost close the gap to the centralized maximum likelihood estimator at a reduced computational cost.
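The local estimation idea described in the abstract can be illustrated with a minimal toy sketch: each node inverts the sample covariance restricted to its own neighborhood and keeps the row of that local inverse corresponding to itself, and the rows are then assembled into a global precision estimate. This is only an illustrative simplification of the local marginal-likelihood idea, not the paper's convex relaxation or its exact combination rule; the function name and the neighborhood dictionary format are assumptions of this sketch.

```python
import numpy as np

def local_precision_estimates(X, neighborhoods):
    """Toy distributed precision estimation.

    For each node i, invert the sample covariance restricted to the
    closed neighborhood {i} U neighbors(i) and keep the row of the
    local inverse corresponding to node i.  Illustrative only; not
    the paper's MML convex relaxation.

    X: (T, p) data matrix; neighborhoods: dict node -> list of neighbors.
    """
    T, p = X.shape
    S = X.T @ X / T                 # sample covariance (zero-mean data)
    K_hat = np.zeros((p, p))
    for i, nbrs in neighborhoods.items():
        idx = sorted({i, *nbrs})    # closed local neighborhood
        S_local = S[np.ix_(idx, idx)]
        K_local = np.linalg.inv(S_local)   # local (marginal) MLE
        pos = idx.index(i)
        K_hat[i, idx] = K_local[pos, :]    # keep node i's own row
    return K_hat

# Usage on a chain graph, whose true precision matrix is tridiagonal.
rng = np.random.default_rng(0)
p = 5
K_true = (np.eye(p)
          + 0.4 * np.diag(np.ones(p - 1), 1)
          + 0.4 * np.diag(np.ones(p - 1), -1))
Sigma = np.linalg.inv(K_true)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=20000)
neighborhoods = {i: [j for j in (i - 1, i + 1) if 0 <= j < p]
                 for i in range(p)}
K_hat = local_precision_estimates(X, neighborhoods)
```

On a chain graph each node is conditionally independent of the rest given its one-hop neighbors, so the row of the local inverse recovers the corresponding row of the true precision matrix up to sampling error; on loopy graphs larger (e.g. two-hop) neighborhoods reduce the bias, which is the effect the paper quantifies.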

References (31)
Ami Wiesel, Zhaoshi Meng, Alfred O. Hero, Dennis L. Wei, "Distributed Learning of Gaussian Graphical Models via Marginal Likelihoods," International Conference on Artificial Intelligence and Statistics, pp. 39-47 (2013)
Daphne Koller, Nir Friedman, "Probabilistic Graphical Models: Principles and Techniques," The MIT Press (2009)
James Franklin, "The Elements of Statistical Learning: Data Mining, Inference, and Prediction," The Mathematical Intelligencer, vol. 27, pp. 83-85 (2005). DOI: 10.1007/BF02985802
Yair Weiss, Kevin P. Murphy, Michael I. Jordan, "Loopy Belief Propagation for Approximate Inference: An Empirical Study," Uncertainty in Artificial Intelligence, pp. 467-475 (1999)
S. Milgram, "The Small World Problem," Psychology Today, vol. 1, pp. 60-67 (1967)
Adam J. Rothman, Elizaveta Levina, Peter J. Bickel, Ji Zhu, "Sparse Permutation Invariant Covariance Estimation," Electronic Journal of Statistics, vol. 2, pp. 494-515 (2008). DOI: 10.1214/08-EJS176
Chengjing Wang, Defeng Sun, Kim-Chuan Toh, "Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm," SIAM Journal on Optimization, vol. 20, pp. 2994-3013 (2010). DOI: 10.1137/090772514
Nicolai Meinshausen, Peter Bühlmann, "High-Dimensional Graphs and Variable Selection with the Lasso," Annals of Statistics, vol. 34, pp. 1436-1462 (2006). DOI: 10.1214/009053606000000281
Zhaoshi Meng, Dennis Wei, Alfred O. Hero, Ami Wiesel, "Marginal Likelihoods for Distributed Estimation of Graphical Model Parameters," IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, pp. 73-76 (2013). DOI: 10.1109/CAMSAP.2013.6714010
Ami Wiesel, Alfred O. Hero, "Distributed Covariance Estimation in Gaussian Graphical Models," IEEE Transactions on Signal Processing, vol. 60, pp. 211-220 (2012). DOI: 10.1109/TSP.2011.2172430