ETRI Knowledge Sharing Platform

Details

Conference Paper: Activation Functions of Deep Neural Networks for Polar Decoding Applications
Cited 9 times in Scopus
Authors
서지훈, 이주열, 김근영
Publication Date
2017.10
Source
International Symposium on Personal, Indoor, and Mobile Radio Communication (PIMRC) 2017, pp.1-5
DOI
https://dx.doi.org/10.1109/PIMRC.2017.8292678
Project
17ZF1100, Development of wireless transmission technology for reaching the theoretical limit in multipoint-to-multipoint environments, 김근영
Abstract
Among various deep neural network (DNN) components, this paper studies activation functions, especially for deep feed-forward networks, with application to the channel decoding problem of polar codes. In line with our previous study, this paper considers the ReLU (Rectified Linear Unit) and its variants as activation functions of the DNN. We devise a new ReLU variant, called the Sloped ReLU, by varying the slope of the ReLU over the positive domain. This is motivated by an analogy, over tree architectures, between the likelihood function in successive decoding of channel codes and the activation function in a DNN. Our numerical results show that polar decoding performance with the Sloped ReLU improves as the slope increases, up to a certain level. We believe that the idea of using this analogy to determine the activation functions of a DNN can be applied to other decoding problems as well, which remains as future work.
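
The abstract describes the Sloped ReLU only in words. As a reading aid, the minimal Python sketch below shows one plausible form of it, assuming the variant simply scales the standard ReLU by a slope parameter on the positive domain, as the abstract suggests; the function name, signature, and example values are illustrative assumptions, not taken from the paper.

    import numpy as np

    def sloped_relu(x, slope=1.0):
        # Assumed form of the Sloped ReLU: zero for non-positive inputs,
        # slope * x for positive inputs. slope = 1.0 reduces to the standard ReLU.
        return np.where(x > 0.0, slope * x, 0.0)

    # Usage: compare the standard ReLU (slope 1) with a steeper Sloped ReLU.
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(sloped_relu(x))             # [0.  0.  0.  0.5 2. ]
    print(sloped_relu(x, slope=3.0))  # [0.  0.  0.  1.5 6. ]

Under this reading, increasing the slope only rescales the positive-side response; the paper's numerical results indicate that such a rescaling helps polar decoding performance up to a certain level.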
KSP Keywords
Activation function, Channel decoding, Deep neural network(DNN), Feed-Forward, Numerical results, Polar codes, Rectified linear unit, Successive decoding, decoding performance, likelihood function