ETRI Knowledge Sharing Platform
Comparison between STDP and Gradient-descent Training Processes for Spiking Neural Networks Using MNIST Digits
Cited 0 times in Scopus · Downloaded 12 times
Authors
Taewook Kang, Kwang-Il Oh, Jae-Jin Lee, Wangrok Oh
Issue Date
2022-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2022, pp.1732-1734
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC55196.2022.9952721
Project Code
22HS4900, Development of ultra-low power intelligent edge SoC technology based on lightweight RISC-V processor, Koo Bon Tae
Abstract
Spiking neural networks (SNNs), driven by an event-based spiking process, can perform rapid inference with low power consumption compared to other neural networks. This study compares spike-timing-dependent plasticity (STDP)-based unsupervised training and backpropagation-based gradient-descent (BGD) training for SNNs, evaluated on a selected subset of the MNIST digits.
KSP Keywords
event-driven, spike-timing-dependent plasticity, gradient descent, low power consumption, spiking neural networks, unsupervised training
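The abstract contrasts STDP-based unsupervised training with gradient-descent training. For readers unfamiliar with STDP, the sketch below shows a generic pair-based STDP weight update; the learning-rate constants (`a_plus`, `a_minus`), time constant (`tau`), and weight bounds are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a pair-based STDP weight update (illustrative only;
# the paper's exact rule and hyperparameters are not specified here).
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Update weight w given pre-/post-synaptic spike times (in ms).

    Potentiate when the pre spike precedes the post spike (dt > 0),
    depress otherwise, with exponential dependence on the time gap.
    """
    dt = t_post - t_pre
    if dt > 0:
        w += a_plus * np.exp(-dt / tau)   # long-term potentiation
    else:
        w -= a_minus * np.exp(dt / tau)   # long-term depression
    return float(np.clip(w, 0.0, 1.0))    # keep weight in [0, 1]

# Example: a pre spike 5 ms before a post spike strengthens the synapse.
w_new = stdp_update(0.5, t_pre=10.0, t_post=15.0)
```

Unlike gradient descent, this rule uses only locally available spike timing and no error signal, which is why STDP is typically paired with unsupervised training as in the comparison above.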