ETRI Knowledge Sharing Platform





Conference Paper  LazyNet: Lazy Entry Neural Networks for Accelerated and Efficient Inference
Cited 0 times in Scopus · Downloaded 5 times
박준용, 김대영, 문용혁
International Conference on Information and Communication Technology Convergence (ICTC) 2022, pp.495-497
Project: 22HS2700, Development of Adaptive and Lightweight Edge-Collaborative Analysis Technology Capable of Active Immediate Response and Fast Learning, 문용혁
Modern edge devices have become powerful enough to run deep learning tasks, but many challenges remain due to limited resources such as computing power, memory space, and energy. To address these challenges, methods such as channel pruning, network quantization, and early exiting have been introduced to reduce the computational load of these tasks. In this paper, we propose LazyNet, an alternative network that applies skip modules, instead of early exiting, to a pre-trained neural network. We use a small module that preserves spatial information and also provides a metric to decide the computational flow. If a data sample is easy, the network skips most of the computation; if not, the network fully computes the sample for accurate classification. We test our model with various backbone networks on the CIFAR-10 dataset and demonstrate reduced inference time and memory consumption along with increased accuracy.
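The control flow the abstract describes — a cheap gate module deciding per sample whether to skip an expensive backbone block — can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the gate function, threshold value, and toy blocks below are all assumptions for demonstration.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class LazyBlock:
    """One backbone stage paired with a lightweight skip gate (hypothetical names)."""
    heavy: Callable[[list], list]   # expensive backbone computation
    gate: Callable[[list], float]   # cheap module: score of how "easy" the sample is

def lazy_forward(x: list, blocks: List[LazyBlock],
                 threshold: float = 0.9) -> Tuple[list, int]:
    """Run each heavy block only when its gate judges the sample hard.

    Skipping acts as an identity shortcut, so the spatial features pass
    through unchanged — mirroring the skip-module idea in the abstract.
    """
    skipped = 0
    for block in blocks:
        if block.gate(x) >= threshold:
            skipped += 1            # easy sample: avoid the heavy computation
            continue
        x = block.heavy(x)          # hard sample: compute fully for accuracy
    return x, skipped

# Toy usage: a single stage whose gate fires on a simple feature statistic.
blocks = [
    LazyBlock(heavy=lambda v: [2 * t for t in v],
              gate=lambda v: 1.0 if max(v) > 10 else 0.0),
]
hard_out, hard_skips = lazy_forward([1, 2, 3], blocks)   # gate low -> computed
easy_out, easy_skips = lazy_forward([20, 1], blocks)     # gate high -> skipped
```

An inference-time accuracy/latency trade-off then reduces to tuning `threshold`: lower values skip more blocks (faster, cheaper), higher values compute more of the network.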
KSP Suggested Keywords
Backbone Network, CIFAR-10, Computing power, Data samples, Edge devices, Limited resources, Memory space, Model Inference, Computational load, Deep learning (DL)