ETRI Knowledge Sharing Platform

Inference Engine for Embedded Systems Supporting Multi-processors
Authors
Seung-mok Yoo, Changsik Cho, Kyung Hee Lee, Jaebok Park, Seok Jin Yoon, Youngwoon Lee, Byung-Gyu Kim
Issue Date
2020-12
Citation
International Conference on Interdisciplinary Research on Computer Science, Psychology, and Education (ICIPE) 2020, v.4, no.1, pp.157-163
Language
English
Type
Conference Paper
Abstract
In this paper, we introduce the progress of our project on developing an open deep learning inference engine for embedded systems. The engine is designed to be installed in autonomous vehicles to perform fast neural network inference, such as image recognition. Due to the nature of embedded systems, various types of hardware may be used, and some AI applications, e.g., object detection in autonomous vehicles, must be handled in real time. To meet these requirements, our inference engine was designed to operate on multiple OpenCL devices, since commercial off-the-shelf GPUs commonly support OpenCL. This paper also briefly describes how we implemented the multi-processor support introduced in our previous work, and it shows performance improvements as the number of processors used for arithmetic computation changes.
KSP Keywords
AI Applications, Autonomous vehicle, Image recognition, Inference engine, Multi-processor, Real-time, Support part, deep learning(DL), embedded system, network inference, neural network(NN)