ETRI Knowledge Sharing Platform

Improvement in Deep Networks for Optimization Using eXplainable Artificial Intelligence
Cited 7 times in Scopus
Authors
Jin ha Lee, Ik hee Shin, Sang gu Jeong, Seung-Ik Lee, Muhammad Zaigham Zaheer, Beom-Su Seo
Issue Date
2019-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2019, pp.525-530
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC46691.2019.8939943
Abstract
With the recent advancements in the field of explainable artificial intelligence (XAI), which covers the domain of turning deep network architectures from black boxes into comprehensible structures, it has become easy to understand what goes on inside a network when it predicts an output. Many researchers have successfully shown the 'thought process' behind a network's decision making. However, this rich and interesting information has not been utilized beyond the scope of visualization once training finishes. In this work, a novel idea is proposed: utilizing this insight into the network as a training parameter. Layer-wise Relevance Propagation (LRP), which obtains the effect of each neuron on the output of the whole network, is used as a parameter, along with the learning rate and network weights, to optimize training. Various intuitive formulations have been proposed, and the results of experiments on the MNIST and CIFAR-10 datasets are reported in this paper. Our proposed methodologies show better or comparable performance against conventional optimization algorithms. This could open a new dimension of research exploring the possibility of using XAI in optimizing the training of neural networks.
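To make the LRP step described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of the epsilon-rule for propagating relevance back through one dense layer. The function name `lrp_epsilon` and the choice of taking the rectified forward output as the initial relevance are illustrative assumptions; the paper's actual formulations combine these relevance scores with the learning rate and weights during training.

```python
import numpy as np

def lrp_epsilon(W, a, R_out, eps=1e-6):
    """Propagate output relevance back to the inputs of one dense layer.

    W     : weight matrix of shape (n_in, n_out)
    a     : input activations, shape (n_in,)
    R_out : relevance assigned to the layer outputs, shape (n_out,)
    """
    z = a @ W                     # pre-activations, shape (n_out,)
    z = z + eps * np.sign(z)      # epsilon stabilizer avoids division by zero
    s = R_out / z                 # per-output relevance per unit of activation
    return a * (W @ s)            # relevance redistributed to inputs, (n_in,)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
a = rng.random(4)
R_out = np.maximum(a @ W, 0.0)    # illustrative: rectified output as relevance

R_in = lrp_epsilon(W, a, R_out)
```

A useful sanity check on such a sketch is the (approximate) conservation property of the epsilon-rule: the total relevance entering a layer should nearly equal the total relevance leaving it, up to the small amount absorbed by the stabilizer.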
KSP Keywords
CIFAR-10, Learning rate, Network Architecture, Optimization algorithm, Thought process, artificial intelligence, decision making, deep networks, neural network