On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices

Authors: Dongseok Kwon, Suhwan Lim, Jong-Ho Bae, Sung-Tae Lee, Hyeongsu Kim

DOI: 10.3389/FNINS.2020.00423

Abstract: Hardware-based spiking neural networks (SNNs), inspired by the biological nervous system, are regarded as an innovative computing paradigm with very low power consumption and massively parallel operation. To train SNNs with supervision, we propose an efficient on-chip training scheme that approximates the backpropagation algorithm in a form suitable for hardware implementation. We show that the accuracy of the proposed scheme is close to that of conventional artificial neural networks (ANNs) by exploiting the stochastic characteristics of the neurons. In the hardware configuration, gated Schottky diodes (GSDs) are used as synaptic devices; their output current saturates with respect to the input voltage. We design the SNN system around the GSDs so that the synaptic devices can update their conductance in parallel, which speeds up the overall system. The performance of the on-chip training system is validated through MNIST data set classification with respect to network size and the total number of time steps. The systems achieve 97.83% accuracy with 1 hidden layer and 98.44% with 4 hidden layers in fully connected networks. We then evaluate the effect of the non-linearity and asymmetry of the conductance response during long-term potentiation (LTP) and long-term depression (LTD). In addition, the impact of device variations is evaluated.
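The paper itself ships no code; the sketch below only illustrates the kind of behavior the abstract refers to. It models non-linear, asymmetric LTP/LTD conductance updates with the widely used exponential bound-approach model (cf. the Querlioz et al. reference style of device modeling) plus optional cycle-to-cycle variation, and a gradient-sign update as a generic stand-in for the approximated backpropagation, since sign-only updates are what allow identical pulses to be applied to all devices in parallel. All names and parameters (ltp_step, ltd_step, alpha, beta, sigma) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    # Conductance window of the model device (normalized; assumed values).
    G_MIN, G_MAX = 0.0, 1.0
    rng = np.random.default_rng(0)

    def ltp_step(g, alpha=0.02, beta=3.0, sigma=0.0):
        """One potentiation pulse: the step shrinks exponentially as g nears G_MAX,
        which is the non-linearity the abstract evaluates."""
        dg = alpha * np.exp(-beta * (g - G_MIN) / (G_MAX - G_MIN))
        dg *= 1.0 + sigma * rng.standard_normal(np.shape(g))  # device variation
        return np.clip(g + dg, G_MIN, G_MAX)

    def ltd_step(g, alpha=0.02, beta=3.0, sigma=0.0):
        """One depression pulse: the step shrinks as g nears G_MIN. Using different
        alpha/beta for LTP and LTD would model the asymmetry of the response."""
        dg = alpha * np.exp(-beta * (G_MAX - g) / (G_MAX - G_MIN))
        dg *= 1.0 + sigma * rng.standard_normal(np.shape(g))
        return np.clip(g - dg, G_MIN, G_MAX)

    def sign_update(g, grad, sigma=0.0):
        """Sign-only update: every device receives an identical LTP or LTD pulse
        chosen by the gradient sign, so a whole array can be pulsed in parallel."""
        g = np.where(grad < 0, ltp_step(g, sigma=sigma), g)
        g = np.where(grad > 0, ltd_step(g, sigma=sigma), g)
        return g

    # Example: two synapses potentiate, one depresses, one is left unchanged.
    g = np.full(4, 0.5)
    print(sign_update(g, np.array([-1.0, -1.0, 1.0, 0.0]), sigma=0.05))

Under this model, repeated pulses drive the conductance toward its bounds with ever smaller steps, which is what makes the LTP/LTD response non-linear and, with unequal parameters, asymmetric; sweeping sigma gives a simple proxy for the device-variation study mentioned in the abstract.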
