Authors: B.D. Brown, H.C. Card
DOI: 10.1109/12.954505
Keywords: Artificial neural network, Finite-state machine, Models of neural computation, Binary number, Nonlinear system, Stochastic neural network, Multiplication, Competitive learning, Algorithm, Stochastic computing, Computation, Sigmoid function, Computer science
Abstract: This paper examines a number of stochastic computational elements employed in artificial neural networks, several of which are introduced for the first time, together with an analysis of their operation. We briefly cover multiplication, squaring, addition, subtraction, and division circuits in both unipolar and bipolar formats, the principles of which are well known, at least for unipolar signals, and we have introduced modifications to improve their speed. The primary contribution of this paper, however, is the introduction of state machine-based elements for performing sigmoid nonlinearity mappings, linear gain, and exponentiation functions. We also describe an efficient method for the generation of, and conversion between, stochastic and deterministic binary signals. The validity of the present approach is demonstrated in a companion paper through a sample application, the recognition of noisy optical characters using soft competitive learning. The generalization capabilities of the stochastic network maintain the squared error within 10 percent of that of a floating-point implementation over a wide range of noise levels. While the accuracy of stochastic computation may not compare favorably with more conventional radix-based computation, its low circuit area and power characteristics may, in certain situations, make it attractive for VLSI implementations of artificial neural networks.
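The abstract refers to the standard stochastic-computing encodings on which these circuits are built: a value is carried as the probability that a bitstream bit is 1, so a single AND gate multiplies two independent unipolar streams and an XNOR gate multiplies two bipolar streams. The following is a minimal software sketch of that principle (not the paper's hardware circuits); the function names such as `to_stochastic` and the stream length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_stochastic(p, n_bits=4096):
    """Encode a unipolar probability p in [0, 1] as a stochastic bitstream:
    each bit is 1 with probability p (comparator against a uniform random number)."""
    return (rng.random(n_bits) < p).astype(np.uint8)

def from_stochastic(bits):
    """Decode a unipolar bitstream back to a probability estimate (mean of the bits)."""
    return bits.mean()

# Unipolar multiplication: an AND gate on two independent streams, since
# P(a AND b) = P(a) * P(b) when the streams are uncorrelated.
a, b = 0.8, 0.5
product = from_stochastic(to_stochastic(a) & to_stochastic(b))
print(f"unipolar {a} * {b} ~= {product:.3f}")      # close to 0.40

# Bipolar multiplication: a value x in [-1, 1] is encoded as p = (x + 1) / 2,
# and an XNOR gate implements multiplication in this format.
x, y = 0.6, -0.5
sx, sy = to_stochastic((x + 1) / 2), to_stochastic((y + 1) / 2)
prod_bipolar = 2 * from_stochastic(1 - (sx ^ sy)) - 1
print(f"bipolar {x} * {y} ~= {prod_bipolar:.3f}")  # close to -0.30
```

The estimates converge only as the stream length grows (the error falls roughly as the inverse square root of the number of bits), which is the accuracy-versus-area/power trade-off the abstract alludes to.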