ETRI Knowledge Sharing Platform

Multi-task Stack Propagation for Neural Quality Estimation
Cited 5 times in Scopus
Authors: Hyun Kim, Jong-Hyeok Lee, Seung-Hoon Na
Issue Date: 2019-08
Citation: ACM Transactions on Asian and Low-Resource Language Information Processing, v.18, no.4, pp.1-18
ISSN: 2375-4699
Publisher: ACM
Language: English
Type: Journal Article
DOI: https://dx.doi.org/10.1145/3321127
Abstract
Quality estimation is an important task in machine translation that has attracted increased interest in recent years. A key problem in translation-quality estimation is the lack of a sufficient amount of quality-annotated training data. To address this shortcoming, the Predictor-Estimator was recently proposed by introducing "word prediction" as an additional pre-subtask that predicts the current target word from the surrounding source and target contexts, resulting in a two-stage neural model composed of a predictor and an estimator. However, the original Predictor-Estimator is not trained as a continuous stacking model but in a cascaded manner that trains the predictor separately from the estimator. In addition, the Predictor-Estimator is trained with single-task learning only, using target-specific quality-estimation data without exploiting the training data available from quality-estimation tasks at other levels. In this article, we therefore propose multi-task stack propagation, which applies stack propagation to fully train the Predictor-Estimator as a continuous stacking architecture and multi-task learning to augment the training data with related quality-estimation tasks at other levels. Experimental results on the WMT17 quality-estimation datasets show that the Predictor-Estimator trained with multi-task stack propagation provides statistically significant improvements over the baseline models. In particular, under an ensemble setting, the proposed multi-task stack propagation achieves state-of-the-art performance at all of the sentence, word, and phrase levels of the WMT17 quality-estimation tasks.
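
To make the training scheme described in the abstract concrete, the following is a minimal PyTorch sketch of how a predictor and a stacked estimator might be trained jointly on the word-prediction pre-subtask and on word- and sentence-level quality estimation. The class names, layer sizes, source/target fusion, and toy data are illustrative assumptions, not the authors' released implementation.

# Minimal, hypothetical sketch (PyTorch) of the idea described in the abstract: a
# predictor produces word-prediction features, a stacked estimator maps them to
# word- and sentence-level quality scores, and one joint loss is backpropagated
# through both stages (stack propagation) and both QE levels (multi-task learning).
# All layer sizes, the fusion scheme, and the toy data are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Predictor(nn.Module):
    """Word-prediction pre-subtask; its hidden states serve as QE features."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.word_out = nn.Linear(2 * hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        src_h, _ = self.encoder(self.embed(src_ids))              # (B, S, 2H)
        tgt_h, _ = self.encoder(self.embed(tgt_ids))              # (B, T, 2H)
        # Crude source/target fusion; the original conditions each target word on
        # surrounding source and target contexts rather than on the word itself.
        feats = tgt_h + src_h.mean(dim=1, keepdim=True)           # (B, T, 2H)
        return feats, self.word_out(feats)                        # QE features, word logits

class Estimator(nn.Module):
    """Stacked on top of the predictor's features; multi-task QE heads."""
    def __init__(self, feat_dim=256, hid_dim=64):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim, hid_dim, batch_first=True)
        self.word_head = nn.Linear(hid_dim, 2)   # OK/BAD tag per target word
        self.sent_head = nn.Linear(hid_dim, 1)   # HTER-style sentence score

    def forward(self, feats):
        h, _ = self.rnn(feats)
        return self.word_head(h), torch.sigmoid(self.sent_head(h.mean(dim=1)))

# One joint training step on toy data.
vocab, B, S, T = 100, 4, 7, 6
predictor, estimator = Predictor(vocab), Estimator()
opt = torch.optim.Adam(list(predictor.parameters()) + list(estimator.parameters()), lr=1e-3)

src = torch.randint(0, vocab, (B, S))          # source sentence ids
tgt = torch.randint(0, vocab, (B, T))          # MT output ids
word_tags = torch.randint(0, 2, (B, T))        # word-level OK/BAD labels
sent_hter = torch.rand(B, 1)                   # sentence-level HTER scores

feats, wp_logits = predictor(src, tgt)
qe_word_logits, qe_sent = estimator(feats)

# Multi-task stack propagation: the word-prediction loss and both QE losses are
# summed and backpropagated in a single pass, so estimator gradients also update
# the predictor instead of the two stages being trained separately in cascade.
opt.zero_grad()
loss = (F.cross_entropy(wp_logits.reshape(-1, vocab), tgt.reshape(-1))
        + F.cross_entropy(qe_word_logits.reshape(-1, 2), word_tags.reshape(-1))
        + F.mse_loss(qe_sent, sent_hter))
loss.backward()
opt.step()
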
KSP Keywords
State-of-the-art performance, Estimation quality, Estimation tasks, Machine translation (MT), Neural model, Quality estimation, Two-stage, Multi-task learning, Training data