ETRI Knowledge Sharing Platform

Simultaneous Neural Machine Translation with a Reinforced Attention Mechanism
Cited 11 times in Scopus · Downloaded 259 times
Authors
YoHan Lee, JongHun Shin, YoungKil Kim
Issue Date
2021-10
Citation
ETRI Journal, v.43, no.5, pp.775-786
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.2020-0358
Abstract
To translate in real time, a simultaneous translation system must determine when to stop reading source tokens and generate target tokens for the partial source sentence read up to that point. However, conventional attention-based neural machine translation (NMT) models cannot produce translations with adequate latency in online scenarios because they wait until a source sentence is complete before computing the alignment between source and target tokens. To address this issue, we propose a reinforcement learning (RL)-based attention mechanism, the reinforced attention mechanism, which allows a neural translation model to jointly train the stopping criterion and a partial translation model. The proposed attention mechanism comprises two modules, one to ensure translation quality and the other to control latency. Unlike previous RL-based simultaneous translation systems, which learn a stopping criterion on top of a fixed NMT model, the two modules can be trained jointly with a novel reward function. In our experiments, the proposed model achieves better translation quality than previous models at comparable latency.
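As a rough illustration of the READ/WRITE framing the abstract describes, the Python sketch below interleaves reading source tokens with emitting target tokens under a stopping policy, then scores the episode with a reward that trades translation quality against latency. Every name here (run_episode, average_proportion, the unigram-overlap quality proxy, the lam weight) is a hypothetical stand-in; the paper's actual reward function and joint training procedure are not reproduced.

import random

READ, WRITE = "READ", "WRITE"

def average_proportion(read_counts, src_len):
    # Mean fraction of the source consumed when each target token was
    # emitted; a common latency proxy, assumed here rather than taken
    # from the paper.
    return sum(read_counts) / (len(read_counts) * src_len)

def run_episode(src, policy, translate_prefix, max_len=50):
    # Alternate READ/WRITE actions until the end-of-sentence token.
    n_read, hyp, read_counts = 0, [], []
    while len(hyp) < max_len:
        state = (n_read, len(hyp))
        # Once the source is exhausted, only WRITE remains.
        action = policy(state) if n_read < len(src) else WRITE
        if action == READ:
            n_read += 1                       # consume one source token
        else:
            tok = translate_prefix(src[:n_read], hyp)
            hyp.append(tok)                   # commit one target token now
            read_counts.append(n_read)
            if tok == "</s>":
                break
    return hyp, read_counts

def episode_reward(hyp, ref, read_counts, src_len, lam=0.5):
    # Quality term minus a weighted latency term; the trade-off weight
    # lam and the unigram-overlap quality proxy are illustrative only.
    quality = len(set(hyp) & set(ref)) / max(len(ref), 1)
    return quality - lam * average_proportion(read_counts, src_len)

if __name__ == "__main__":
    src = "nous avons propose un modele </s>".split()
    ref = "we proposed a model </s>".split()
    # A random policy and an oracle "translator" stand in for the trained
    # stopping criterion and partial translation model.
    policy = lambda state: READ if random.random() < 0.5 else WRITE
    translate = lambda prefix, hyp: ref[len(hyp)]
    hyp, reads = run_episode(src, policy, translate)
    print(hyp, episode_reward(hyp, ref, reads, len(src)))

In an actual system, the random policy would be replaced by the learned stopping criterion and translate by decoding from the partial translation model, with both trained jointly against the combined reward.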
KSP Keywords
Attention mechanism, Machine Translation (MT), Neural machine translation, Proposed model, Real-Time, Reinforcement learning, Stopping criterion, Translation Model, Translation quality, Translation system, reward function
This work is distributed under the terms of the Korea Open Government License (KOGL)
(Type 4: Type 1 + Commercial Use Prohibition + Change Prohibition).