ETRI Knowledge Sharing Platform

Detailed Information

Journal Article: Dynamic Inference Approach Based on Rules Engine in Intelligent Edge Computing for Building Environment Control
Cited 4 times in Scopus · Downloaded 28 times
Authors
Wenquan Jin, Rongxu Xu, 임선환, 박동환, 박찬원, 김도현
Publication Date
January 2021
Source
Sensors, v.21 no.2, pp.1-21
ISSN
1424-8220
Publisher
MDPI
DOI
https://dx.doi.org/10.3390/s21020630
Research Project
21HR4500, Development of Core Technologies for a Highly Reliable AI-Data Commons Framework Based on 5G-IoT, 임선환
Abstract
Computation offloading enables computationally intensive tasks in edge computing to be distributed across server-side computing resources, overcoming the hardware limitations of edge devices. Deep learning derives inference models by training on large volumes of data with sufficient computing resources, and deploying such domain-specific inference models to edge computing provides intelligent services close to the network edge. In this paper, we propose intelligent edge computing that provides a dynamic inference approach for building environment control. The dynamic inference approach relies on a rules engine deployed on the edge gateway, which selects an inference function according to the triggered rule. The edge gateway is deployed at the entry point of the network edge and provides comprehensive functions, including device management, device proxy, client services, intelligent services, and the rules engine. These functions are implemented as microservice provider modules, which make offloading domain-specific solutions to the edge gateway flexible, extensible, and lightweight. Additionally, the intelligent services can be updated by offloading a microservice provider module containing new inference models. Using the rules engine, the edge gateway then operates an intelligent scenario based on the deployed rule profile by requesting the corresponding inference model from the intelligent service provider. The inference models are derived by training deep learning models on building-user data on the edge server, which provides high-performance computing resources. The intelligent service provider packages the inference models and delivers intelligent functions on the edge gateway's constrained hardware through microservices. Moreover, to bridge the Internet of Things (IoT) device network to the Internet, the gateway provides device management and proxy functions that give web clients access to the devices.
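The rule-driven selection described in the abstract can be sketched as follows: a rules engine on the edge gateway evaluates a sensor context against deployed rule profiles, and the first triggered rule determines which inference function of the intelligent service provider is invoked. This is a minimal illustration only; all class, rule, and model names below are hypothetical and are not taken from the paper's implementation.

```python
# Minimal sketch of rules-engine-driven dynamic inference selection.
# All names (EdgeRulesEngine, rule profiles, inference keys) are
# illustrative assumptions, not the paper's actual API.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

SensorContext = Dict[str, float]

@dataclass
class Rule:
    name: str
    condition: Callable[[SensorContext], bool]  # predicate over sensor data
    inference: str  # key of the inference function to invoke when triggered

@dataclass
class EdgeRulesEngine:
    rules: List[Rule] = field(default_factory=list)
    inference_models: Dict[str, Callable[[SensorContext], str]] = field(default_factory=dict)

    def register_model(self, key: str, fn: Callable[[SensorContext], str]) -> None:
        # Models arrive as offloaded microservice provider modules.
        self.inference_models[key] = fn

    def deploy_rule(self, rule: Rule) -> None:
        # Rule profiles are deployed to the gateway at configuration time.
        self.rules.append(rule)

    def infer(self, context: SensorContext) -> str:
        # The first triggered rule selects the inference function.
        for rule in self.rules:
            if rule.condition(context):
                return self.inference_models[rule.inference](context)
        return "no-op"

engine = EdgeRulesEngine()
engine.register_model("hvac", lambda ctx: "cool" if ctx["temp"] > 26 else "heat")
engine.register_model("light", lambda ctx: "dim" if ctx["lux"] > 500 else "brighten")
engine.deploy_rule(Rule("comfort", lambda ctx: "temp" in ctx, "hvac"))
engine.deploy_rule(Rule("lighting", lambda ctx: "lux" in ctx, "light"))

print(engine.infer({"temp": 28.0}))  # cool
print(engine.infer({"lux": 120.0}))  # brighten
```

In the paper's architecture the inference functions would be served by trained deep learning models rather than the toy threshold lambdas used here; the sketch shows only the dispatch pattern, where updating intelligence means registering a new model module or rule profile without changing the engine itself.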
Keywords
Computational offloading, Deep learning, Edge computing, Inference model, Rules engine
KSP Suggested Keywords
Building environment, Computation offloading, Computational offloading, Computing resources, Device access, Domain-specific, Environment Control, High Performance Computing, Internet of thing(IoT), Learning approach, Learning model
This work is available under the terms of the Creative Commons Attribution (CC BY) license.