Authors: Chenggang Zhou, R. N. Bhatt
DOI: 10.1103/PHYSREVE.72.025701
Keywords:
Abstract: We present a mathematical analysis of the Wang-Landau algorithm, prove its convergence, and identify sources of error and strategies for optimization. In particular, we find that the histogram increases uniformly with small fluctuations after an initial stage of accumulation, and that the statistical error scales as $\sqrt{\ln f}$ with the modification factor $f$. This has implications for strategies to obtain fast convergence.
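To make the role of the modification factor $f$ concrete, here is a minimal sketch of the standard Wang-Landau procedure the abstract refers to, applied to a small 2D Ising model. This is an illustrative assumption, not the authors' implementation: the function name `wang_landau_ising`, the lattice size, the 80% flatness criterion, and the halving rule $\ln f \to \ln f / 2$ are conventional choices supplied here for demonstration.

```python
# Minimal Wang-Landau sketch (assumed standard formulation, not the paper's code):
# random walk in energy space for a small 2D Ising model, accepting moves with
# probability min(1, g(E_old)/g(E_new)), updating ln g(E) and the histogram,
# and halving ln f whenever the histogram is sufficiently flat.
import numpy as np

def wang_landau_ising(L=4, flatness=0.8, ln_f_final=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))

    def total_energy(s):
        # Nearest-neighbor Ising energy with periodic boundary conditions
        return -np.sum(s * (np.roll(s, 1, axis=0) + np.roll(s, 1, axis=1)))

    n_sites = L * L
    # Energies of the periodic 2D Ising model lie in [-2N, 2N] in steps of 4
    energies = np.arange(-2 * n_sites, 2 * n_sites + 1, 4)
    e_index = {e: i for i, e in enumerate(energies)}

    ln_g = np.zeros(len(energies))   # running estimate of ln g(E)
    hist = np.zeros(len(energies))   # visit histogram for the current stage
    ln_f = 1.0                       # modification factor, ln f = 1 initially
    E = total_energy(spins)

    while ln_f > ln_f_final:
        for _ in range(1000 * n_sites):
            i, j = rng.integers(L, size=2)
            # Energy change from flipping spin (i, j)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb
            old, new = e_index[E], e_index[E + dE]
            # Accept with probability min(1, g(E_old)/g(E_new))
            if ln_g[old] >= ln_g[new] or rng.random() < np.exp(ln_g[old] - ln_g[new]):
                spins[i, j] *= -1
                E += dE
            idx = e_index[E]
            ln_g[idx] += ln_f        # g(E) <- g(E) * f
            hist[idx] += 1
        visited = hist > 0
        if hist[visited].min() > flatness * hist[visited].mean():
            hist[:] = 0              # flat enough: reset histogram
            ln_f /= 2.0              # and reduce the modification factor

    return energies, ln_g

if __name__ == "__main__":
    energies, ln_g = wang_landau_ising()
    print(ln_g - ln_g.min())         # relative ln g(E), lowest bin set to 0
```

In this scheme the final $\ln f$ sets the granularity of the last refinement stage, which is where the abstract's claim that the statistical error scales as $\sqrt{\ln f}$ becomes relevant for choosing how far to reduce $f$.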