ETRI Knowledge Sharing Platform

QEBERT: Bilingual BERT using Multi-task Learning for Neural Quality Estimation
Authors
Hyun Kim, Joon-Ho Lim, Hyun-Ki Kim, Seung-Hoon Na
Issue Date
2019-08
Citation
Conference on Machine Translation (WMT) 2019, pp.85-89
Publisher
Association for Computational Linguistics
Language
English
Type
Conference Paper
Abstract
For translation quality estimation at the word and sentence levels, this paper presents a novel approach based on BERT, which has recently achieved impressive results on various natural language processing tasks. Our proposed model re-purposes BERT for translation quality estimation and uses multi-task learning for the sentence-level task and word-level subtasks (i.e., source word, target word, and target gap). Experimental results on the Quality Estimation shared task of WMT19 show that our systems achieve competitive results and provide significant improvements over the baseline.
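The multi-task setup described in the abstract — a shared encoder feeding one sentence-level head and separate word-level heads — can be sketched as follows. This is a minimal numpy illustration, not the paper's implementation: the random matrix `H` stands in for the bilingual BERT encoder output, the linear heads and the toy HTER/tag targets are hypothetical, and real training would use the WMT QE annotations and gradient-based optimization.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 8  # number of tokens and hidden size (toy values)

# Stand-in for the bilingual BERT encoder output:
# one d-dimensional vector per token of the (source, target) input.
H = rng.normal(size=(T, d))

# Task-specific heads on top of the shared representation
# (multi-task learning: sentence-level score + word-level tags).
W_sent = rng.normal(size=(d, 1))  # sentence-level quality regression
W_word = rng.normal(size=(d, 2))  # word-level OK/BAD classification

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Sentence-level score from a pooled (CLS-like) representation
sent_score = float(sigmoid(H.mean(axis=0) @ W_sent))

# Word-level OK/BAD probabilities, one distribution per token
word_probs = softmax(H @ W_word)

# Multi-task objective: sum of the per-task losses
# (toy targets; real targets come from the shared-task data)
hter_target = 0.3
tag_targets = rng.integers(0, 2, size=T)
loss_sent = (sent_score - hter_target) ** 2
loss_word = -np.mean(np.log(word_probs[np.arange(T), tag_targets]))
loss = loss_sent + loss_word
```

Sharing the encoder across the sentence-level task and the word-level subtasks is what lets supervision from one task regularize the others, which is the motivation for the multi-task design.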
KSP Keywords
Natural Language Processing, Novel approach, Proposed model, Quality estimation, Shared task, Translation quality, multi-task learning, word-level