ETRI Knowledge Sharing Platform

Interworking Technology of Neural Network and Data among Deep Learning Frameworks
Cited 2 times in Scopus, downloaded 236 times
Authors
Jaebok Park, Seungmok Yoo, Seokjin Yoon, Kyunghee Lee, Changsik Cho
Issue Date
2019-12
Citation
ETRI Journal, v.41, no.6, pp.760-770
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.2018-0135
Abstract
Based on the growing demand for neural network technologies, various neural network inference engines are being developed. However, each inference engine has its own neural network storage format, and there is a growing demand for standardization to solve this problem. This study presents interworking techniques for ensuring the compatibility of neural networks and data among the various deep learning frameworks. The proposed technique standardizes the graphic expression grammar and learning data storage format using the Neural Network Exchange Format (NNEF) of Khronos. The proposed converter includes a lexical analyzer, a syntax analyzer, and a parser. The NNEF parser converts neural network information into a parsing tree and quantizes the data. To validate the proposed system, we verified that MNIST is executed immediately after importing AlexNet's neural network and learned data. Therefore, this study contributes an efficient design technique for a converter that can execute a neural network and learned data in various frameworks regardless of the storage format of each framework.
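
The converter pipeline described in the abstract (lexical analysis, syntax analysis, parsing into a tree, and quantization of learned data) can be illustrated with a small sketch. The Python fragment below is a hypothetical illustration, not the authors' implementation: it tokenizes a toy NNEF-style graph description, builds a flat parse tree of operation nodes, and linearly quantizes a weight vector to signed 8-bit integers. The grammar subset, names, and quantization scheme are assumptions made for illustration; the actual converter handles the full Khronos NNEF grammar and storage format.

# Hypothetical sketch (not the authors' converter): lexer -> parser -> quantizer
# for a toy NNEF-style graph fragment.
import re
from dataclasses import dataclass
from typing import List, Tuple

NNEF_FRAGMENT = """
input = external();
conv1 = conv(input, kernel1, bias1);
relu1 = relu(conv1);
"""

# Lexical analyzer: identifiers, numbers, and the few symbols the toy grammar needs.
TOKEN_RE = re.compile(r"\s*(?:(?P<ID>[A-Za-z_]\w*)|(?P<NUM>\d+)|(?P<SYM>[=(),;]))")

@dataclass
class Node:
    """One parse-tree node: an operation assigned to a result tensor."""
    result: str
    op: str
    args: List[str]

def tokenize(text: str) -> List[str]:
    """Split the fragment into a flat token stream."""
    tokens, pos = [], 0
    while pos < len(text):
        match = TOKEN_RE.match(text, pos)
        if match is None:          # trailing whitespace or unknown character
            pos += 1
            continue
        tokens.append(match.group().strip())
        pos = match.end()
    return tokens

def parse(tokens: List[str]) -> List[Node]:
    """Syntax analysis: statements of the form 'result = op(arg, ...);'."""
    nodes, i = [], 0
    while i < len(tokens):
        result, op = tokens[i], tokens[i + 2]
        assert tokens[i + 1] == "=" and tokens[i + 3] == "("
        i += 4
        args = []
        while tokens[i] != ")":
            if tokens[i] != ",":
                args.append(tokens[i])
            i += 1
        assert tokens[i + 1] == ";"
        nodes.append(Node(result, op, args))
        i += 2                     # step past ')' and ';'
    return nodes

def quantize(weights: List[float], bits: int = 8) -> Tuple[List[int], float]:
    """Linear quantization of learned data to signed integers (illustrative only)."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

if __name__ == "__main__":
    for node in parse(tokenize(NNEF_FRAGMENT)):
        print(f"{node.result} <- {node.op}({', '.join(node.args)})")
    q, scale = quantize([0.12, -0.53, 0.97, 0.04])
    print("quantized:", q, "scale:", round(scale, 6))

Running the sketch prints the three parsed operation nodes followed by the quantized weights and their scale factor, mirroring the front end and quantization steps the abstract attributes to the converter.
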
KSP Keywords
Deep learning framework, Design techniques, Exchange format, Graphic expression, Inference engine, Learning data, Network Storage, Network Technology, Network information, Parsing tree, Storage Format
This work is distributed under the terms of the Korea Open Government License (KOGL), Type 4: Type 1 + Commercial Use Prohibition + Change Prohibition.