ETRI Knowledge Sharing Platform

Detailed Information

Journal Article
Multi-task Stack Propagation for Neural Quality Estimation
Cited 5 times in Scopus · Downloaded 9 times
Authors
김현, 이종혁, 나승훈
Publication Date
2019.08
Source
ACM Transactions on Asian and Low-Resource Language Information Processing, v.18 no.4, pp.1-18
ISSN
2375-4699
Publisher
ACM
DOI
https://dx.doi.org/10.1145/3321127
Research Project
19HS3200, (Exobrain, Subproject 1) Development of Knowledge Evolutionary WiseQA Platform Technology for Human Knowledge Augmented Services, 김현기
Abstract
Quality estimation is an important task in machine translation that has attracted increased interest in recent years. A key problem in translation-quality estimation is the lack of a sufficient amount of quality-annotated training data. To address this shortcoming, the Predictor-Estimator was recently proposed by introducing "word prediction" as an additional pre-subtask that predicts the current target word from the surrounding source and target contexts, resulting in a two-stage neural model composed of a predictor and an estimator. However, the original Predictor-Estimator is not trained as a continuous stacking model but in a cascaded manner that trains the predictor separately from the estimator. In addition, the Predictor-Estimator is trained with single-task learning only, which uses target-specific quality-estimation data without exploiting the training data available from quality-estimation tasks at other levels. In this article, we therefore propose multi-task stack propagation, which extensively applies stack propagation to fully train the Predictor-Estimator as a continuous stacking architecture, and multi-task learning to augment the training data with related quality-estimation tasks at other levels. Experimental results on the WMT17 quality-estimation datasets show that the Predictor-Estimator trained with multi-task stack propagation yields statistically significant improvements over the baseline models. In particular, under an ensemble setting, the proposed multi-task stack propagation achieves state-of-the-art performance at the sentence, word, and phrase levels of the WMT17 quality-estimation tasks.
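The abstract describes the architecture and training scheme but not their mechanics. Below is a minimal, illustrative sketch (not the authors' released code) of the idea: a predictor that learns word prediction from source and target context, an estimator stacked on the predictor's feature vectors that emits word-level OK/BAD tags and a sentence-level HTER score, and a single joint objective so that the estimator's losses backpropagate through the predictor (stack propagation) while several task levels are trained together (multi-task learning). All layer choices, dimensions, and the random toy batch are assumptions for illustration only.

```python
# Minimal sketch of a Predictor-Estimator trained with multi-task stack
# propagation. Dimensions, layers, and data are illustrative assumptions.
import torch
import torch.nn as nn

class Predictor(nn.Module):
    """Word-prediction pre-subtask: predicts each target word from the source
    sentence and surrounding target context (approximated here with BiGRUs)."""
    def __init__(self, src_vocab, tgt_vocab, dim=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.src_rnn = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        self.tgt_rnn = nn.GRU(dim, dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(4 * dim, tgt_vocab)

    def forward(self, src, tgt):
        src_h, _ = self.src_rnn(self.src_emb(src))                 # (B, S, 2d)
        tgt_h, _ = self.tgt_rnn(self.tgt_emb(tgt))                 # (B, T, 2d)
        ctx = src_h.mean(dim=1, keepdim=True).expand(-1, tgt_h.size(1), -1)
        feats = torch.cat([tgt_h, ctx], dim=-1)                    # QE feature vectors
        return self.out(feats), feats

class Estimator(nn.Module):
    """Estimator stacked on the predictor's features: word-level OK/BAD tags
    and a sentence-level HTER score (multi-level quality estimation)."""
    def __init__(self, feat_dim, dim=64):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, dim, batch_first=True, bidirectional=True)
        self.word_head = nn.Linear(2 * dim, 2)                     # OK / BAD
        self.sent_head = nn.Linear(2 * dim, 1)                     # HTER in [0, 1]

    def forward(self, feats):
        h, _ = self.rnn(feats)
        return self.word_head(h), torch.sigmoid(self.sent_head(h.mean(dim=1)))

# Toy batch: random ids/labels stand in for WMT17 QE data.
B, S, T, V = 4, 7, 6, 100
src = torch.randint(0, V, (B, S))
tgt = torch.randint(0, V, (B, T))
word_tags = torch.randint(0, 2, (B, T))
hter = torch.rand(B)

predictor, estimator = Predictor(V, V), Estimator(feat_dim=4 * 64)
opt = torch.optim.Adam(
    list(predictor.parameters()) + list(estimator.parameters()), lr=1e-3)

# Stack propagation: one continuous stack, so the estimator's losses
# backpropagate into the predictor. Multi-task: word-prediction, word-level,
# and sentence-level losses are combined into a single objective.
logits, feats = predictor(src, tgt)
word_logits, sent_score = estimator(feats)
loss = (nn.functional.cross_entropy(logits.reshape(-1, V), tgt.reshape(-1))
        + nn.functional.cross_entropy(word_logits.reshape(-1, 2), word_tags.reshape(-1))
        + nn.functional.mse_loss(sent_score.squeeze(-1), hter))
opt.zero_grad()
loss.backward()
opt.step()
```

In the cascaded baseline described above, the predictor would instead be trained first on the word-prediction loss alone and then held fixed while the estimator is trained; the joint loss over the continuous stack is what stack propagation changes, and the extra word- and sentence-level terms are what multi-task learning adds.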
KSP Suggested Keywords
Art performance, Estimation quality, Estimation tasks, Machine Translation (MT), Neural models, Quality estimation, Two-stage, Multi-task learning, State-of-the-art, Training data