ETRI Knowledge Sharing Platform

Adaptive Network Compression using Block Removal and Recycling Mechanisms
Authors: Kimin Yun, Jongwon Choi
Issue Date: 2021-11
Citation: TECHART: Journal of Arts and Imaging Science, v.8, no.4, pp.41-45
ISSN: 2288-9248
Publisher: 중앙대학교 영상콘텐츠융합연구소 (Chung-Ang University)
Language: English
Type: Journal Article
DOI: https://dx.doi.org/10.15323/techart.2021.11.8.4.41
Abstract
Recent advancements in deep learning have improved the performance of various visual tasks; however, inference with deep learning models requires heavy computation, for which expensive GPUs are essential. In this study, we propose a novel scheme to compress the computation and storage memory of deep learning models, even without supervision from the testing environment. To obtain general robustness across various domains, the proposed algorithm trains the deep learning model using two novel mechanisms: block removal and block recycling. In addition, the trained network continues to function robustly after compression. We validated the classification performance of the proposed algorithm on CIFAR10. The results demonstrate that the presented algorithm achieves impressive performance at reasonable compression rates.
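
The abstract does not detail the block removal and recycling mechanisms, so the following is only a minimal, illustrative sketch under one possible reading: "block removal" is treated here as stochastically skipping residual blocks during training (similar in spirit to stochastic depth), so that the network still produces valid predictions when blocks are pruned at inference time. This is not the authors' implementation; names such as SkippableBlock and keep_prob are hypothetical, and block recycling is not modeled.

import torch
import torch.nn as nn


class SkippableBlock(nn.Module):
    """Residual block that may be randomly skipped ("removed") during training."""

    def __init__(self, channels: int, keep_prob: float = 0.8):
        super().__init__()
        self.keep_prob = keep_prob  # hypothetical hyperparameter: chance the block is kept
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # During training, occasionally bypass the block entirely so the rest of
        # the network learns to tolerate its absence.
        if self.training and torch.rand(1).item() > self.keep_prob:
            return x
        return self.act(x + self.body(x))


class TinyNet(nn.Module):
    """Small CIFAR10-sized classifier built from skippable residual blocks."""

    def __init__(self, num_blocks: int = 4, channels: int = 32, num_classes: int = 10):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.blocks = nn.ModuleList([SkippableBlock(channels) for _ in range(num_blocks)])
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, num_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.stem(x)
        for block in self.blocks:
            x = block(x)
        return self.head(x)


if __name__ == "__main__":
    model = TinyNet()
    model.train()
    print(model(torch.randn(2, 3, 32, 32)).shape)  # CIFAR10-shaped input -> torch.Size([2, 10])

    # Toy "compression": permanently drop the last two blocks, then run inference.
    # Because training randomly removed blocks, the pruned network still works.
    model.blocks = model.blocks[:-2]
    model.eval()
    print(model(torch.randn(2, 3, 32, 32)).shape)

In this sketch the compression step simply deletes whole blocks, which reduces both computation and parameter storage; how many blocks to remove, and how removed blocks might be "recycled", would follow the paper's own procedure rather than this toy example.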
KSP Keywords: Classification performance, adaptive network, deep learning (DL), deep learning models