ETRI Knowledge Sharing Platform


Empirical Evaluation of Activation Functions and Kernel Initializers on Deep Reinforcement Learning
Cited 6 times in Scopus
Authors
Sooyoung Jang, Youngsung Son
Issue Date
2019-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2019, pp.1140-1142
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC46691.2019.8939854
Abstract
Hyperparameter optimization has a considerable impact on system performance. However, many researchers use the default hyperparameter configurations provided by their machine learning frameworks, because optimization requires a lot of effort, time, and computing resources. In this paper, we analyze the effect of the activation function and the kernel initializer on the performance of deep reinforcement learning. In detail, we compared the performance of 42 combinations of 7 activation functions and 6 kernel initializers on two different network types, namely a multi-layer perceptron (MLP) and a convolutional neural network (CNN). The results revealed not only that the optimal combinations differed, but also that, even with the same hyperparameter, the performance difference could be dramatic depending on the network type. This phenomenon was especially prominent for the kernel initializer.
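The 42-combination grid described in the abstract (7 activation functions × 6 kernel initializers, evaluated separately on an MLP and a CNN) can be sketched as a simple sweep scaffold. The specific activation and initializer names below are common Keras-style options chosen for illustration; the abstract does not list which ones the paper actually evaluated, so they are assumptions:

```python
from itertools import product

# Hypothetical option lists: 7 activations and 6 kernel initializers,
# matching the grid size in the paper but NOT taken from it.
ACTIVATIONS = ["relu", "tanh", "sigmoid", "elu", "selu",
               "softplus", "softsign"]
INITIALIZERS = ["glorot_uniform", "glorot_normal", "he_uniform",
                "he_normal", "lecun_uniform", "lecun_normal"]
NETWORK_TYPES = ["mlp", "cnn"]  # the two network types compared

def sweep_configs():
    """Yield one config dict per (network, activation, initializer) triple."""
    for net, act, init in product(NETWORK_TYPES, ACTIVATIONS, INITIALIZERS):
        yield {"network": net, "activation": act, "initializer": init}

configs = list(sweep_configs())
# 7 activations x 6 initializers = 42 combinations per network type.
assert len(configs) == 2 * 7 * 6
```

In an actual experiment, each config would be passed to the agent's network builder and trained to completion, which is why the paper notes that such sweeps demand substantial time and computing resources.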