ETRI Knowledge Sharing Platform

Dynamic Inference Approach Based on Rules Engine in Intelligent Edge Computing for Building Environment Control
Cited 22 times in Scopus · Downloaded 205 times
Authors: Wenquan Jin, Rongxu Xu, Sunhwan Lim, Dong-Hwan Park, Chanwon Park, Dohyeun Kim
Issue Date: 2021-01
Citation: Sensors, v.21, no.2, pp.1-21
ISSN: 1424-8220
Publisher: MDPI
Language: English
Type: Journal Article
DOI: https://dx.doi.org/10.3390/s21020630
Abstract
Computation offloading enables intensive computational tasks in edge computing to be distributed across multiple computing resources of the server, overcoming hardware limitations. Deep learning derives inference models from a learning approach that requires a large volume of data and sufficient computing resources. Deploying domain-specific inference approaches to edge computing, however, provides intelligent services close to the edge of the network. In this paper, we propose intelligent edge computing that provides a dynamic inference approach for building environment control. The dynamic inference approach is based on a rules engine deployed on the edge gateway, which selects an inference function according to the triggered rule. The edge gateway is deployed at the entry of the network edge and provides comprehensive functions, including device management, device proxy, client service, intelligent service, and a rules engine. These functions are provided by microservice provider modules that enable flexibility, extensibility, and light weight for offloading domain-specific solutions to the edge gateway. Additionally, the intelligent services can be updated by offloading the microservice provider module together with the inference models. Using the rules engine, the edge gateway then operates an intelligent scenario based on the deployed rule profile by requesting the inference model from the intelligent service provider. The inference models are derived by training the building user data with a deep learning model on the edge server, which provides a high-performance computing resource. The intelligent service provider includes the inference models and provides intelligent functions in the edge gateway on a constrained hardware resource based on microservices. Moreover, to bridge the Internet of Things (IoT) device network to the Internet, the gateway provides device management and a proxy that enable device access from web clients.
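
The central mechanism described in the abstract, a rules engine on the edge gateway that selects an inference function according to the triggered rule in a deployed rule profile, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation; the rule names, thresholds, sensor fields, and model placeholders are assumptions introduced only for the example.

```python
# Minimal sketch of a rules-engine-driven inference selector for building
# environment control. All rule names, thresholds, and "models" below are
# hypothetical placeholders, not the paper's actual services or values.

from dataclasses import dataclass
from typing import Callable, Dict, List, Optional


@dataclass
class Rule:
    name: str
    condition: Callable[[Dict[str, float]], bool]   # evaluated against sensed data
    inference: Callable[[Dict[str, float]], float]  # inference function to invoke


def infer_heating_setpoint(data: Dict[str, float]) -> float:
    # Placeholder for a trained model served by the intelligent service provider.
    return 22.0 - 0.1 * data["occupancy"]


def infer_ventilation_level(data: Dict[str, float]) -> float:
    # Placeholder for a second model selected by a different rule.
    return min(1.0, data["co2"] / 1000.0)


class RulesEngine:
    """Runs the inference function of the first rule triggered by new sensor data."""

    def __init__(self, profile: List[Rule]):
        self.profile = profile  # rule profile deployed to the edge gateway

    def on_sensor_update(self, data: Dict[str, float]) -> Optional[float]:
        for rule in self.profile:
            if rule.condition(data):
                return rule.inference(data)
        return None  # no rule triggered; no inference requested


# Illustrative rule profile and sensor reading.
profile = [
    Rule("low_temperature", lambda d: d["temperature"] < 20.0, infer_heating_setpoint),
    Rule("high_co2", lambda d: d["co2"] > 800.0, infer_ventilation_level),
]

engine = RulesEngine(profile)
print(engine.on_sensor_update({"temperature": 18.5, "co2": 650.0, "occupancy": 3}))
```

In the paper's architecture the inference functions themselves would be provided by offloaded microservice modules on the gateway rather than local Python functions; the sketch only shows how a triggered rule maps to a specific inference call.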
KSP Keywords
Building environment, Computation offloading, Computing resources, Device access, Domain-specific, Edge Computing, High-performance computing (HPC), Learning approach, Light-weight, Rules engine, Service Provider
This work is distributed under the terms of the Creative Commons License (CC BY).