Lazy Net: Lazy Entry Neural Networks for Accelerated and Efficient Inference
Authors
Junyong Park, Dae-Young Kim, Yong-Hyuk Moon
Issue Date
2022-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2022, pp.495-497
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC55196.2022.9953031
Abstract
Modern edge devices have become powerful enough to run deep learning tasks, but many challenges remain, such as limited computing power, memory space, and energy. To address these challenges, methods such as channel pruning, network quantization, and early exiting have been introduced to reduce the computational load of these tasks. In this paper, we propose LazyNet, an alternative network that applies skip modules instead of early exits to a pre-trained neural network. We use a small module that preserves spatial information and also provides a metric for deciding the computational flow. If a data sample is easy, the network skips most of the computation; if not, the network fully processes the sample for accurate classification. We test our model with various backbone networks on the CIFAR-10 dataset and show reduced inference time, lower memory consumption, and increased accuracy.
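The abstract describes per-sample skip modules: a lightweight gate inspects the incoming activations and decides whether the expensive block runs or is bypassed. Below is a minimal PyTorch sketch of that general idea, assuming a residual-style backbone stage; the names (LazyBlock, gate), the pooled gate design, and the threshold are illustrative assumptions, not the authors' implementation (the paper's module reportedly preserves spatial information, which this simplified gate does not).

```python
# Hypothetical sketch of a skip-module gate; not the authors' code.
import torch
import torch.nn as nn

class LazyBlock(nn.Module):
    """Wraps a pre-trained block with a small gate that decides, per input,
    whether to run the block or skip it (identity) for easy samples."""
    def __init__(self, block: nn.Module, channels: int, threshold: float = 0.5):
        super().__init__()
        self.block = block          # pre-trained sub-network, kept unchanged
        self.threshold = threshold  # assumed skip/run decision boundary
        # Tiny gate: pooled channel summary -> scalar "hardness" score.
        # (The paper's gate preserves spatial information; pooling here
        # is a simplification to keep the sketch short.)
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        score = self.gate(x)  # shape (N, 1): confidence the sample is "hard"
        if self.training:
            # Soft blend keeps the gating decision differentiable in training.
            s = score.view(-1, 1, 1, 1)
            return s * self.block(x) + (1 - s) * x
        # At inference, skip the block entirely when the batch looks easy.
        if bool((score < self.threshold).all()):
            return x            # identity: computation saved
        return self.block(x)

# Usage: wrap a stage of a pre-trained backbone with the lazy gate.
stage = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1),
                      nn.BatchNorm2d(64), nn.ReLU())
lazy = LazyBlock(stage, channels=64).eval()
with torch.no_grad():
    out = lazy(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

In this reading, skipped blocks behave as identity mappings, so easy samples traverse a much shallower effective network while hard samples still receive the full computation; the actual gate architecture and training procedure are those of the paper, not this sketch.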
KSP Keywords
Backbone Network, CIFAR-10, Computing power, Data samples, Edge devices, Limited resources, Memory space, Model Inference, Computational load, Deep learning (DL)