Authors: Colin Sandon, Emmanuel Abbe
DOI:
Keywords: Mathematical optimization, Universality (dynamical systems), Model parameters, Stochastic block model, Scaling, Logarithm, Efficient algorithm, Mathematics, Weak consistency
Abstract: Most recent developments on the stochastic block model (SBM) rely on knowledge of the model parameters, or at least on the number of communities. This paper introduces efficient algorithms that do not require such knowledge and yet achieve the optimal information-theoretic tradeoffs identified in [AS15] for linear size communities. The results are three-fold: (i) in the constant degree regime, an algorithm is developed that requires only a lower-bound on the relative sizes of the communities and detects communities with an optimal accuracy scaling for large degrees; (ii) in the regime where degrees are scaled by $\omega(1)$ (diverging degrees), this is enhanced into a fully agnostic algorithm that only takes the graph in question and simultaneously learns the model parameters (including the number of communities) and detects communities with accuracy $1-o(1)$, with an overall quasi-linear complexity; (iii) in the logarithmic degree regime, an agnostic algorithm is developed that achieves the CH-limit for exact recovery, in quasi-linear time. These provide the first algorithms affording efficiency, universality and information-theoretic optimality for strong and weak consistency in the general SBM.
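For readers unfamiliar with the model, a minimal sketch of sampling from the SBM may help fix the setup; this is an illustrative planted-partition sampler (the function name, parameters, and the symmetric two-parameter special case are assumptions for illustration, not the paper's general SBM or its algorithms).

```python
import random

def sample_sbm(sizes, p_in, p_out, seed=0):
    """Sample a graph from a symmetric (planted-partition) SBM.

    sizes: list of community sizes; p_in / p_out: edge probabilities
    within and between communities. Returns (labels, edges), where
    labels[v] is the community index of vertex v.
    """
    rng = random.Random(seed)
    # Assign vertices 0..n-1 to communities according to `sizes`.
    labels = [c for c, s in enumerate(sizes) for _ in range(s)]
    n = len(labels)
    edges = []
    # Each unordered pair is an edge independently, with probability
    # depending only on whether the endpoints share a community.
    for u in range(n):
        for v in range(u + 1, n):
            p = p_in if labels[u] == labels[v] else p_out
            if rng.random() < p:
                edges.append((u, v))
    return labels, edges

# Two communities of 50 vertices; dense inside, sparse across.
labels, edges = sample_sbm([50, 50], p_in=0.5, p_out=0.05, seed=1)
```

An agnostic algorithm in the paper's sense would receive only `edges` (not `labels`, `p_in`, `p_out`, or even the number of communities) and recover the partition.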