ETRI Knowledge Sharing Platform

Backward Graph Construction and Lowering in DL Compiler for Model Training on AI Accelerators
Cited 2 times in Scopus
Authors
Hyunjeong Kwon, Youngsu Kwon, Jinho Han
Issue Date
2022-10
Citation
International SoC Design Conference (ISOCC) 2022, pp.91-92
Publisher
Institute of Semiconductor Engineers (반도체공학회)
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ISOCC56007.2022.10031488
Abstract
A deep learning (DL) compiler is required to accelerate model inference and training on AI accelerators. In this work, we propose a novel approach to constructing a backward graph from a PyTorch model and lowering it to machine code. The backward graph is constructed using information from PyTorch's autograd engine. A newly proposed lexer and parser convert the generated graph into an abstract syntax tree (AST), and the AST is converted to GIR, an intermediate representation within the MLIR framework. IR lowering is then applied to the GIR, producing LLVM IR for the LLVM backend. Operators that can be accelerated on the DL accelerator are dispatched to it through an LLVM IR call function that invokes the accelerator's backend. In the experiment, the proposed compiler estimated the training loss with an average error of 1.46% within 6.7 seconds while processing 100 epochs.
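The front end described above reads the backward graph out of PyTorch's autograd engine. The short Python sketch below illustrates the kind of information available there; it is an illustrative assumption, not the authors' published code, and the helper walk_backward_graph is hypothetical. It traverses the autograd graph of a traced loss via grad_fn and next_functions, the same structure such a compiler would consume before emitting its AST and GIR.

    import torch

    def walk_backward_graph(fn, seen=None, depth=0):
        # Depth-first walk over autograd Function nodes; next_functions
        # links each node to the grad_fns of its inputs.
        if seen is None:
            seen = set()
        if fn is None or fn in seen:
            return
        seen.add(fn)
        print("  " * depth + type(fn).__name__)
        for next_fn, _input_index in fn.next_functions:
            walk_backward_graph(next_fn, seen, depth + 1)

    # One forward pass on a tiny model is enough for autograd to record the graph.
    model = torch.nn.Linear(4, 2)
    loss = model(torch.randn(1, 4)).sum()
    walk_backward_graph(loss.grad_fn)  # e.g. SumBackward0 -> AddmmBackward0 -> ...

From node names and edges like these, a lexer and parser of the kind the paper proposes could derive the AST that is subsequently lowered to GIR and then LLVM IR.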
KSP Keywords
Average error, Intermediate representation, LLVM IR, Machine code, Model Inference, Novel approach, Using information, abstract syntax tree, deep learning (DL), graph construction