ETRI Knowledge Sharing Platform


Activation Functions of Deep Neural Networks for Polar Decoding Applications
Cited 9 times in Scopus. Downloaded 1 time.
Authors
Jihoon Seo, Juyul Lee, Keunyoung Kim
Issue Date
2017-10
Citation
International Symposium on Personal, Indoor, and Mobile Radio Communication (PIMRC) 2017, pp.1-5
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/PIMRC.2017.8292678
Project Code
17ZF1100, Wireless Transmission Technology in Multi-point to Multi-point Communications, Keunyoung Kim
Abstract
Among the various components of deep neural networks (DNNs), this paper studies activation functions, particularly for deep feed-forward networks applied to channel decoding of polar codes. In line with our previous study, this paper considers the ReLU (Rectified Linear Unit) and its variants as activation functions for DNNs. We devise a new ReLU variant, called the Sloped ReLU, by varying the slope of the ReLU over the positive domain. This is motivated by the analogy between the tree architecture of the likelihood function in successive decoding of channel codes and that of the activation function in a DNN. Our numerical results show that polar decoding performance with the Sloped ReLU improves as the slope increases, up to a certain level. We believe that the idea of exploiting this analogy to determine DNN activation functions can be applied to other decoding problems as well, which remains future work.
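
The record does not give the exact parameterization of the Sloped ReLU. A minimal sketch, assuming the variant simply scales the positive branch of the standard ReLU by a constant slope (slope = 1 recovering the ordinary ReLU); the function names and the slope parameter below are illustrative, not taken from the paper.

```python
import numpy as np

def relu(x):
    # Standard ReLU: f(x) = max(0, x).
    return np.maximum(0.0, x)

def sloped_relu(x, slope=1.5):
    # Assumed Sloped ReLU: f(x) = slope * x for x > 0, else 0.
    # slope = 1.0 reduces to the standard ReLU; per the abstract, polar
    # decoding performance improves as the slope increases, up to a
    # certain level.
    return np.where(x > 0.0, slope * x, 0.0)

# Usage: compare the two activations on a small input vector.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))              # [0.  0.  0.  0.5 2. ]
print(sloped_relu(x, 1.5))  # [0.   0.   0.   0.75 3.  ]
```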
KSP Keywords
Activation function, Channel decoding, Deep neural network (DNN), Feed-Forward, Numerical results, Polar codes, Rectified linear unit, Successive decoding, decoding performance, likelihood function