ETRI Knowledge Sharing Platform

On the Hardness of Pruning NASNet
Authors
Jong-Ryul Lee, Yong-Hyuk Moon
Issue Date
2022-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2022, pp.1897-1899
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC55196.2022.9952781
Abstract
NASNet is a well-known convolutional neural network generated by a neural architecture search algorithm. Its topology contains skip connections, as in ResNet and DenseNet, but it also exhibits a distinct property called inner channel coupling. Inner channel coupling is a major obstacle to pruning internal channels in NASNet, which is why channel pruning of NASNet has not been studied in the literature. Motivated by this, we explain what inner channel coupling is and why it arises in NASNet. We then analyze how it makes pruning NASNet especially hard. Finally, we conduct experiments to explore how NASNet changes over different pruning ratios. To the best of our knowledge, this is the first work to introduce inner channel coupling and channel pruning for NASNet.
KSP Keywords
Channel coupling, Convolutional neural network (CNN), Different aspect, Search algorithm (GSA), Neural network (NN), Skip connections, Topological structure
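
As a rough illustration of the coupling constraint described in the abstract, the following minimal PyTorch sketch (not from the paper; the module, shapes, and pruning criterion are hypothetical) shows why branches whose outputs are combined element-wise must be pruned with the same channel mask: removing a channel from only one branch would make the combined tensors inconsistent.

```python
# Hypothetical example; NASNet cells are more complex, but the coupling idea is the same.
import torch
import torch.nn as nn


class TwoBranchCell(nn.Module):
    """Two parallel convolutions whose outputs are summed, as in many NAS-style cells."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.branch_a = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.branch_b = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        # The element-wise sum couples the output channels of both branches:
        # channel k of branch_a is added to channel k of branch_b, so the two
        # branches must keep exactly the same set of output channels.
        return self.branch_a(x) + self.branch_b(x)


def prune_coupled_channels(cell: TwoBranchCell, keep: torch.Tensor) -> TwoBranchCell:
    """Prune output channels jointly across both branches.

    `keep` is a boolean mask over output channels; it is applied to both
    branches at once, which is what the coupling forces a pruner to do.
    """
    in_ch = cell.branch_a.in_channels
    out_ch = int(keep.sum())
    pruned = TwoBranchCell(in_ch, out_ch)
    with torch.no_grad():
        for name in ("branch_a", "branch_b"):
            old = getattr(cell, name)
            new = getattr(pruned, name)
            new.weight.copy_(old.weight[keep])  # keep only the selected output channels
            new.bias.copy_(old.bias[keep])
    return pruned


if __name__ == "__main__":
    cell = TwoBranchCell(in_ch=8, out_ch=16)
    keep = torch.ones(16, dtype=torch.bool)
    keep[::4] = False  # drop every 4th channel (an arbitrary, hypothetical criterion)
    pruned = prune_coupled_channels(cell, keep)
    x = torch.randn(1, 8, 32, 32)
    print(pruned(x).shape)  # torch.Size([1, 12, 32, 32])
```

In a single-path network, each convolution's output channels could be masked independently; here the shared sum means one mask governs both branches, which shrinks the space of valid pruning choices and is the kind of constraint the paper identifies as making NASNet hard to prune.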