Authors: Vassilis P. Plagianakos, Michael N. Vrahatis
Keywords:
Abstract: In this paper, Parallel Evolutionary Algorithms for integer-weight neural network training are presented. To this end, each processor is assigned a subpopulation of potential solutions. The subpopulations are evolved independently in parallel, and occasional migration is employed to allow cooperation between them. The proposed algorithms are applied to train neural networks using threshold activation functions and weight values confined to a narrow band of integers. We constrain the weights and biases to the range [−3, 3], so that each can be represented by just 3 bits. Such networks are better suited for hardware implementation than networks with real-valued weights. These algorithms have been designed keeping in mind that the resulting integer weights require fewer bits to be stored and that the digital arithmetic operations between them are easier to implement in hardware. Another advantage of the proposed evolutionary strategies is that they are capable of continuing the training process ``on-chip'', if needed. Our intention is to present the performance of this class of methods on this difficult task. Based on its application to classical neural network problems, our experience is that these methods are effective and reliable.
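To make the setup concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of the island-model idea the abstract describes: several subpopulations of integer weight vectors in [−3, 3] are evolved independently, with occasional ring migration of each island's best individual, to train a tiny threshold-activation network. The 2-2-1 XOR network, population sizes, mutation scheme, and migration interval are all illustrative assumptions; a real parallel version would place each island on its own processor.

```python
import random

LOW, HIGH = -3, 3  # weights/biases confined to [-3, 3]: 7 values, storable in 3 bits
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # illustrative task

def step(x):
    """Threshold (hard-limit) activation, as used in the paper's networks."""
    return 1 if x > 0 else 0

def forward(w, inp):
    """2-2-1 threshold network; w holds 9 integers (weights and biases)."""
    h1 = step(w[0] * inp[0] + w[1] * inp[1] + w[2])
    h2 = step(w[3] * inp[0] + w[4] * inp[1] + w[5])
    return step(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):
    """Number of correctly classified patterns (4 = XOR solved)."""
    return sum(forward(w, x) == y for x, y in XOR)

def evolve(islands, gens=500, migrate_every=20, rng=random):
    """Island-model EA: independent evolution plus occasional ring migration."""
    for g in range(gens):
        for pop in islands:
            pop.sort(key=fitness, reverse=True)
            elite = pop[: len(pop) // 2]          # keep the better half
            children = []
            for _ in range(len(pop) - len(elite)):
                child = list(rng.choice(elite))
                i = rng.randrange(len(child))     # mutate one gene by +/-1,
                child[i] = max(LOW, min(HIGH, child[i] + rng.choice([-1, 1])))
                children.append(tuple(child))     # clipped back into [-3, 3]
            pop[:] = elite + children
        if (g + 1) % migrate_every == 0:
            # ring migration: each island receives its neighbour's best
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[-1] = bests[i - 1]
    return max((w for pop in islands for w in pop), key=fitness)

random.seed(0)  # deterministic run for reproducibility
islands = [[tuple(random.randint(LOW, HIGH) for _ in range(9))
            for _ in range(20)] for _ in range(4)]
best = evolve(islands)
print(fitness(best), "of", len(XOR), "patterns correct")
```

Because all weights stay in [−3, 3] and the activations are hard thresholds, every quantity in `forward` is small-integer arithmetic, which is the property the abstract highlights for hardware implementation; the same mutate-and-select loop could in principle continue refining the weights "on-chip".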