ETRI Knowledge Sharing Platform

Details

Conference Paper: An Empirical Study on Channel Pruning through Residual Connections
Authors
이종률, 문용혁
Publication Date
October 2021
Source
International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.1380-1382
DOI
https://dx.doi.org/10.1109/ICTC52510.2021.9620978
Project
21HS7200, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology Capable of Active Immediate Response and Fast Learning, 문용혁
Abstract
Channel pruning is one of the most promising techniques for compressing convolutional neural networks due to its usefulness and runtime-agnostic efficiency. Because pruning channels through residual connections is difficult, most existing works focus on determining which channels should be removed or kept only for the sequentially connected convolution layers inside a residual block. Motivated by this, we investigate the layer-wise dynamics of channel pruning through residual connections and propose simple yet effective pruning methods. These methods do not require any additional training data to compute the importance of channels affected by residual connections. In experiments, we demonstrate that the proposed methods achieve promising performance despite their simplicity and efficiency. In addition, we report interesting observations about the dynamics of channel pruning on layers at different topological positions, which are related to how consistently a layer's filters evaluate the importance of each output channel.
KSP Proposed Keywords
Convolutional neural network (CNN), Empirical study, Pruning method, Training data
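
The abstract refers to data-free importance scores for channels that are coupled by residual connections. The sketch below is a minimal illustration of one common data-free formulation (filter L1-norm importance) combined with a simple rule that aggregates scores across layers whose outputs are added by a skip connection, so those layers keep identical channel indices. The function names, the sum aggregation, and the `keep_ratio` parameter are illustrative assumptions, not the specific methods proposed in the paper.

```python
import numpy as np

def channel_importance(conv_weight):
    """L1 norm of each output-channel filter (a data-free importance score).

    conv_weight: array of shape (out_channels, in_channels, kH, kW).
    """
    return np.abs(conv_weight).sum(axis=(1, 2, 3))

def shared_channel_importance(weights_sharing_residual):
    """Aggregate importance for channels coupled by a residual connection.

    Layers whose outputs are added together (e.g. the last conv of a block
    and the projection conv on the shortcut path) must keep the same channel
    indices, so their per-channel scores are summed into one shared score.
    """
    scores = [channel_importance(w) for w in weights_sharing_residual]
    return np.sum(scores, axis=0)

def select_channels_to_keep(shared_score, keep_ratio=0.5):
    """Keep the top-scoring fraction of the shared channels."""
    n_keep = max(1, int(round(keep_ratio * shared_score.size)))
    return np.sort(np.argsort(shared_score)[-n_keep:])

# Toy usage: two 64-channel convs whose outputs meet at a residual addition.
rng = np.random.default_rng(0)
w_block_out = rng.standard_normal((64, 64, 3, 3))  # last conv inside the block
w_shortcut = rng.standard_normal((64, 64, 1, 1))   # projection on the skip path
keep = select_channels_to_keep(
    shared_channel_importance([w_block_out, w_shortcut]), keep_ratio=0.5)
print(f"keeping {keep.size} of 64 shared channels")
```

Summing the scores of the coupled layers is one simple way to guarantee that every layer feeding the same residual addition prunes the same channel indices; how such choices behave at different topological positions is the layer-wise dynamics the paper investigates.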