Authors: Radford M. Neal, Michael Harvey
DOI:
Keywords:
Abstract: Inference for belief networks using Gibbs sampling produces a distribution for unobserved variables that differs from the correct distribution by a (usually) unknown error, since convergence to the right distribution occurs only asymptotically. The method of "coupling from the past" samples exactly from the correct distribution by (conceptually) running dependent simulations from every possible starting state, from a time far enough in the past that all runs reach the same state at time t = 0. Explicitly considering every possible state is intractable for large networks, however. We propose a method for layered noisy-or networks that uses a compact, but often imprecise, summary of a set of states. This method samples from the correct distribution, and requires only about twice the time per step as ordinary Gibbs sampling, but it may require more simulation steps than would be needed if the chains were tracked exactly.
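To make the "coupling from the past" idea concrete, here is a minimal sketch of the generic CFTP algorithm for a small finite-state chain, tracking every starting state explicitly (the exhaustive approach the abstract says is intractable for large networks, not the authors' summary-based method). The names `cftp_sample` and `step` are illustrative, not from the paper; the key points are that all chains share the same random numbers at each time step, and that the randomness for a given past time is reused when the starting time is pushed further back.

```python
import random

def cftp_sample(n_states, step, rng_seed=0):
    """Coupling from the past on the state space {0, ..., n_states-1}.

    `step(state, u)` maps a state and a uniform random number u in [0,1)
    to the next state; feeding the same u to every chain couples them.
    Returns an exact sample from the chain's stationary distribution.
    """
    rng = random.Random(rng_seed)
    us = []   # randomness for times -1, -2, ...; fixed once drawn
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())
        # Run chains from ALL states, from time -T up to time 0,
        # applying us[T-1] first (time -T) and us[0] last (time -1).
        states = list(range(n_states))
        for t in range(T - 1, -1, -1):
            states = [step(s, us[t]) for s in states]
        if len(set(states)) == 1:   # all chains have coalesced
            return states[0]        # exact stationary sample
        T *= 2                      # restart from further in the past

# Example: biased random walk on {0, 1, 2} (up with prob. 0.7)
def walk_step(s, u):
    return min(s + 1, 2) if u < 0.7 else max(s - 1, 0)

sample = cftp_sample(3, walk_step, rng_seed=42)
```

Because every chain is advanced explicitly, the per-step cost grows with the size of the state space, which is exponential in the number of network variables; the paper's contribution is replacing the explicit set of states with a compact (possibly imprecise) summary for layered noisy-or networks.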