ETRI Knowledge Sharing Platform

Detailed Information

Journal Article: Performance Analysis of Local Exit for Distributed Deep Neural Networks Over Cloud and Edge Computing
Cited 23 times in Scopus; downloaded 143 times
Authors
이창식, 홍승우, 홍성백, 김태연
Publication Date
October 2020
Source
ETRI Journal, v.42 no.5, pp.658-668
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.2020-0112
Funded Project
20HH6400, Development of Intelligent Edge Networking Technology Based on Artificial Intelligence, 김태연
Abstract
In edge computing, most procedures, including data collection, data processing, and service provision, are handled at edge nodes and not in the central cloud. This decreases the processing burden on the central cloud, enabling fast responses to end-device service requests in addition to reducing bandwidth consumption. However, edge nodes have restricted computing, storage, and energy resources to support computation-intensive tasks such as processing deep neural network (DNN) inference. In this study, we analyze the effect of models with single and multiple local exits on DNN inference in an edge-computing environment. Our test results show that a single-exit model performs better with respect to the number of local exited samples, inference accuracy, and inference latency than a multi-exit model at all exit points. These results signify that higher accuracy can be achieved with less computation when a single-exit model is adopted. In edge computing infrastructure, it is therefore more efficient to adopt a DNN model with only one or a few exit points to provide a fast and reliable inference service.
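The local-exit idea the abstract analyzes can be illustrated with a toy sketch: a small network runs its early layers on the edge node, checks a local exit classifier, and returns immediately when the exit's softmax confidence clears a threshold, otherwise forwarding to the final (cloud-side) classifier. This is a minimal illustration only, not the paper's actual model; the layer sizes, random weights, and threshold below are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Placeholder weights for a tiny backbone with one local exit head.
W1 = rng.normal(size=(8, 16)); b1 = np.zeros(16)        # edge-side layer
W_exit = rng.normal(size=(16, 3)); b_exit = np.zeros(3)  # local exit classifier (edge)
W2 = rng.normal(size=(16, 3)); b2 = np.zeros(3)          # final classifier (cloud)

def infer(x, threshold=0.9):
    """Single-exit inference: answer at the local exit if confident enough,
    otherwise continue to the final (cloud-side) classifier."""
    h = np.tanh(x @ W1 + b1)                  # computed on the edge node
    p_exit = softmax(h @ W_exit + b_exit)
    if p_exit.max() >= threshold:
        return int(p_exit.argmax()), "edge"   # sample exits locally
    p_final = softmax(h @ W2 + b2)            # remainder offloaded to the cloud
    return int(p_final.argmax()), "cloud"

label, where = infer(rng.normal(size=8))
```

Raising `threshold` trades latency for accuracy: fewer samples exit locally, so more traverse the full model. The paper's finding that a single exit outperforms many exits corresponds here to keeping only one `W_exit` head rather than attaching one per layer.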
KSP Suggested Keywords
Bandwidth consumption, Computing environment, Data Collection, Data processing, Deep neural network(DNN), Distributed deep neural networks, Performance analysis, Service Provision, Service requests, computing infrastructure, device service
This work may be used under the terms of the Korea Open Government License (KOGL) Type 4: Source Attribution + No Commercial Use + No Modification.