ETRI Knowledge Sharing Platform

Implementation of Generative Model Based Solver for Mathematical Word Problem with Linear Equations
Cited 0 times in Scopus
Authors
Gayoung Kim, Seonho Kim, Junseong Bang
Issue Date
2021-08
Citation
International Conference on Platform Technology and Service (PlatCon) 2021, pp.52-56
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/PlatCon53246.2021.9680762
Abstract
Solving math word problems (MWPs) automatically with a computer is an interesting topic. Recently, deep-learning-based methods have been used to solve MWPs instead of statistical methods and semantic parsing methods. We experimented with deep learning generative models that directly translate a math word problem into a linear equation. In this paper, four MWP solvers using the Sequence-to-Sequence (Seq2Seq) model with an attention mechanism were implemented: a vanilla Seq2Seq model, a BiLSTM Seq2Seq model, a convolutional Seq2Seq model, and a transformer model. Performance analysis of the four MWP solvers was then carried out on the MaWPS (English) and Math23K (Chinese) MWP datasets. Experiments show that the Seq2Seq and transformer models performed similarly when translating into simple linear equations, but the transformer model performed best when translating into more complex linear equations.
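A common preprocessing step in Seq2Seq-style MWP solvers of this kind (widely used with the MaWPS and Math23K datasets, though not detailed in the abstract) is to replace the literal numbers in the problem text with placeholder tokens, so the model learns to generate an equation *template* rather than memorizing specific values. The sketch below is a minimal, hypothetical illustration of that idea; the function names and placeholder format (`N0`, `N1`, ...) are assumptions, not the paper's actual code.

```python
import re

def number_map(problem: str):
    """Replace each number in an MWP with a placeholder token (N0, N1, ...),
    returning the templated text plus the extracted numbers in order."""
    numbers = []
    def repl(match):
        numbers.append(match.group(0))
        return f"N{len(numbers) - 1}"
    templated = re.sub(r"\d+(?:\.\d+)?", repl, problem)
    return templated, numbers

def instantiate(equation_template: str, numbers):
    """Substitute the recorded numbers back into a generated equation template.
    Iterate in reverse so N10 is replaced before N1 matches inside it."""
    for i, n in reversed(list(enumerate(numbers))):
        equation_template = equation_template.replace(f"N{i}", n)
    return equation_template

text = "Tom has 3 apples and buys 5 more. How many apples does he have?"
templated, nums = number_map(text)
# templated: "Tom has N0 apples and buys N1 more. How many apples does he have?"
# A trained solver would then emit a template such as "x = N0 + N1",
# which is instantiated back into a solvable equation:
equation = instantiate("x = N0 + N1", nums)
# equation: "x = 3 + 5"
```

This separation keeps the output vocabulary small (operators, variables, and a bounded set of number tokens), which is one reason generative Seq2Seq models can be trained effectively on relatively small MWP datasets.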
KSP Keywords
Attention mechanism, Best performance, Generative models, Linear equations solving, Mathematical word problem, Model-based method, Performance analysis, Semantic parsing, Statistical methods, deep learning (DL), learning models