ETRI Knowledge Sharing Platform



Details

Conference Paper: Implementation of Generative Model Based Solver for Mathematical Word Problem with Linear Equations
Authors
김가영, 김선호, 방준성
Publication Date
August 2021
Source
International Conference on Platform Technology and Service (PlatCon) 2021, pp.52-56
DOI
https://dx.doi.org/10.1109/PlatCon53246.2021.9680762
Research Project
21IR1800, Development of PolBot, a Conversational Public-Safety Knowledge Service, 방준성
Abstract
Solving math word problems (MWPs) automatically with a computer is an interesting topic. Recently, deep learning model based methods have been used to solve MWPs instead of statistical methods and semantic parsing methods. We experimented with different deep learning generative models that directly translate a math word problem into a linear equation. In this paper, four MWP solvers using the Sequence-to-Sequence (Seq2Seq) model with an attention mechanism were implemented: Seq2Seq, BiLSTM Seq2Seq, convolutional Seq2Seq, and transformer models. A performance analysis of the four MWP solvers was then carried out on the MaWPS (English) and Math23K (Chinese) MWP datasets. The experiments show that the Seq2Seq model and the transformer model performed similarly when translating into simple linear equations, but the transformer model performed best when translating into more complex linear equations.
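All four solvers in the abstract share one core step: at each decoding step, an attention mechanism scores the encoder states against the current decoder state, normalizes the scores with a softmax, and forms a weighted context vector. The sketch below illustrates that step in isolation with numpy; it is not the paper's implementation, and the dot-product scoring function, array shapes, and function name are illustrative assumptions.

```python
import numpy as np

def dot_product_attention(decoder_state, encoder_states):
    """One decoding step of dot-product attention (illustrative sketch).

    decoder_state:  (hidden,)          current decoder hidden state
    encoder_states: (seq_len, hidden)  one hidden state per source token
    """
    # Score each source token by its similarity to the decoder state.
    scores = encoder_states @ decoder_state            # (seq_len,)
    # Softmax (shifted by the max for numerical stability) turns the
    # scores into a probability distribution over source tokens.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The context vector is the attention-weighted sum of encoder states;
    # the decoder consumes it when emitting the next equation token.
    context = weights @ encoder_states                 # (hidden,)
    return weights, context

# Toy example: a 3-token source sentence with hidden size 2.
enc = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dec = np.array([1.0, 0.0])
weights, context = dot_product_attention(dec, enc)
```

In the full solvers, this step runs once per generated token, so the model can attend to different parts of the word problem (e.g. the numbers and the relational phrases) while writing out the equation.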
KSP Suggested Keywords
Attention mechanism, Best performance, Learning model, Linear equations solving, Mathematical word problem, Model-based method, Performance analysis, Semantic parsing, Statistical methods, deep learning(DL), generative models