Authors: Conrad C. Galland, Geoffrey E. Hinton
DOI: 10.1016/B978-1-4832-1448-1.50006-8
Keywords:
Abstract: The simplicity and locality of the "contrastive Hebb synapse" (CHS) used in Boltzmann machine learning makes it an attractive model for real biological synapses. The slow learning exhibited by stochastic Boltzmann machines can be greatly improved using a mean field approximation, and it has been shown (Hinton, 1989) that the CHS also performs steepest descent in these deterministic networks. A major weakness of the learning procedure, from a biological perspective, is that its derivation assumes detailed symmetry of the connectivity. Using networks with purely asymmetric connectivity, we show that the CHS still works in practice provided the connectivity is grossly symmetrical, so that if unit i sends a connection to unit j, there are numerous indirect feedback paths from j to i. So long as the network settles to a stable state, the CHS approximates steepest descent, with a proportional error that is expected to decrease as the network size increases.
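To make the procedure described in the abstract concrete, the sketch below shows a minimal mean-field deterministic network with sigmoid units trained by the contrastive Hebb rule, where the weight change is proportional to the difference between clamped-phase and free-phase pairwise activity products. This is an illustrative reconstruction, not the paper's code: the function names `settle` and `chs_update`, the settling schedule, and the learning rate are assumptions, and the weight matrix is deliberately left unconstrained so it can be initialized asymmetrically in the spirit of the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)


def settle(W, b, s, clamped_idx=None, clamped_vals=None, n_iters=50):
    """Iterate the mean-field update s_i = sigmoid(sum_j W[i, j] s_j + b_i)
    until the activities reach an (approximately) stable state."""
    s = s.copy()
    for _ in range(n_iters):
        s = 1.0 / (1.0 + np.exp(-(W @ s + b)))   # mean-field sigmoid update
        if clamped_idx is not None:
            s[clamped_idx] = clamped_vals        # hold clamped units fixed
    return s


def chs_update(W, b, x_idx, x_vals, y_idx, y_vals, lr=0.1):
    """One contrastive-Hebb update:
    delta_w_ij proportional to (s_i^+ s_j^+ - s_i^- s_j^-),
    where + is the clamped phase and - is the free phase."""
    n = W.shape[0]
    s0 = rng.uniform(0.4, 0.6, size=n)
    # Positive (clamped) phase: inputs and desired outputs both clamped.
    idx = np.concatenate([x_idx, y_idx])
    vals = np.concatenate([x_vals, y_vals])
    s_plus = settle(W, b, s0, idx, vals)
    # Negative (free) phase: only inputs clamped.
    s_minus = settle(W, b, s0, x_idx, x_vals)
    # Local, Hebbian weight change; W need not be symmetric, which is the
    # asymmetric-connectivity setting the abstract refers to.
    dW = lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    np.fill_diagonal(dW, 0.0)                    # no self-connections
    return W + dW, b + lr * (s_plus - s_minus)
```

Under the paper's claim, repeated calls to a rule like `chs_update` should still roughly follow the error gradient even when `W` is not symmetric, provided unit j has many indirect feedback paths back to unit i and the `settle` dynamics reach a stable state.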