ETRI Knowledge Sharing Platform

Conference Paper: On the Hardness of Pruning NASNet
Authors
Jong-Ryul Lee, Yong-Hyuk Moon
Publication Date
October 2022
Source
International Conference on Information and Communication Technology Convergence (ICTC) 2022, pp.1897-1899
DOI
https://dx.doi.org/10.1109/ICTC55196.2022.9952781
Project
22HS2700, Development of Adaptive Lightweight Edge-Collaborative Analysis Technology Capable of Active Immediate Response and Rapid Learning, Yong-Hyuk Moon
Abstract
NASNet is a well-known convolutional neural network generated by a neural architecture search algorithm. Its topology contains skip connections, like ResNet and DenseNet, but it also exhibits a distinct property called inner channel coupling. This coupling is a major obstacle to pruning internal channels in NASNet, which is why NASNet has not been studied in the channel-pruning literature. Motivated by this, we explain what inner channel coupling is and why it arises in NASNet. We then analyze how it makes pruning NASNet especially hard. Finally, we conduct experiments to explore how NASNet changes across different pruning ratios. To the best of our knowledge, this is the first work to introduce inner channel coupling and channel pruning on NASNet.
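The coupling problem the abstract describes can be illustrated with a minimal sketch. The example below is an assumption for illustration only (it is not code from the paper): it uses NumPy to show why an additive skip connection forces the same channel-keep decision onto every branch feeding the addition, so pruning decisions cannot be made per-layer independently.

```python
import numpy as np

# Illustrative sketch (not the paper's method): two branches of a cell
# produce feature maps with the same channel count, which are summed
# element-wise, as in ResNet-style skip connections.
rng = np.random.default_rng(0)
C = 8                                    # number of channels
branch_a = rng.normal(size=(C, 4, 4))    # feature map from branch A
branch_b = rng.normal(size=(C, 4, 4))    # feature map from branch B

# Because the outputs are added element-wise, a single channel mask
# must be applied to BOTH branches: removing channel c from only one
# branch would make the addition shape-inconsistent. This shared mask
# is the kind of cross-layer coupling that constrains channel pruning.
keep = np.array([1, 1, 0, 1, 0, 1, 1, 1], dtype=bool)  # prune channels 2 and 4

pruned_sum = branch_a[keep] + branch_b[keep]
print(pruned_sum.shape)  # (6, 4, 4): both branches lose the same two channels
```

In a NASNet cell, many operations feed into shared additions and concatenations, so these shared masks chain across the whole cell, which is what makes the pruning search space so constrained.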
KSP Suggested Keywords
Channel coupling, Convolutional neural network (CNN), Different aspect, Search algorithm (GSA), Skip connections, Topological structure