ETRI Knowledge Sharing Platform

Journal Article
Consolidation of Subtasks for Target Task in Pipelined NLP Model
Cited 11 times in Scopus
Authors
손정우, 윤희근, 박성배, 조기성, 류원
Publication Date
October 2014
Source
ETRI Journal, v.36 no.5, pp.704-713
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.14.2214.0035
Funded Project
13PR4300, Development of a Knowledge Convergence Service Platform Based on Video Object Recognition, 조기성
Abstract
Most natural language processing tasks depend on the outputs of some other tasks. Thus, they involve other tasks as subtasks. The main problem of this type of pipelined model is that the optimality of the subtasks that are trained with their own data is not guaranteed in the final target task, since the subtasks are not optimized with respect to the target task. As a solution to this problem, this paper proposes a consolidation of subtasks for a target task (CST2). In CST2, all parameters of a target task and its subtasks are optimized to fulfill the objective of the target task. CST2 finds such optimized parameters through a backpropagation algorithm. In experiments in which text chunking is a target task and part-of-speech tagging is its subtask, CST2 outperforms a traditional pipelined text chunker. The experimental results prove the effectiveness of optimizing subtasks with respect to the target task.
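The core idea in the abstract can be sketched numerically: a subtask network (standing in for the POS tagger) feeds its output into a target-task network (standing in for the chunker), and the gradient of the target-task loss is backpropagated through both, so the subtask's parameters are tuned for the target objective rather than for its own. This is a minimal toy sketch, not the paper's actual model; the data, layer sizes, and loss are hypothetical.

```python
# Toy sketch of CST2-style joint training (illustrative assumptions only).
# A "subtask" layer feeds a "target" layer; backprop from the TARGET loss
# updates the parameters of BOTH layers.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 4-dim word features -> binary chunk label.
X = rng.normal(size=(64, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

W_sub = rng.normal(scale=0.1, size=(4, 3))   # subtask (e.g. POS) weights
W_tgt = rng.normal(scale=0.1, size=(3, 1))   # target-task (chunker) weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(200):
    # Forward pass: the subtask's output is the target task's input.
    H = sigmoid(X @ W_sub)            # soft subtask "tags"
    P = sigmoid(H @ W_tgt)            # target-task prediction
    losses.append(float(np.mean((P - y) ** 2)))

    # Backward pass: gradient of the target loss w.r.t. both parameter sets.
    dP = 2.0 * (P - y) / len(y) * P * (1.0 - P)
    dW_tgt = H.T @ dP
    dH = dP @ W_tgt.T * H * (1.0 - H)
    dW_sub = X.T @ dH                 # subtask updated for the target objective

    W_tgt -= lr * dW_tgt
    W_sub -= lr * dW_sub
```

In a conventional pipeline, `W_sub` would be frozen after pretraining on the subtask's own labels; the contrast here is that `dW_sub` is derived from the target loss, which is the consolidation the abstract describes.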
Keywords
Chained task learning, Pipelined NLP model, Task consolidation
KSP-Suggested Keywords
All parameters, NLP model, Natural Language Processing, Optimized parameters, Part of Speech(POS), Part-Of-Speech Tagging, Task consolidation, backpropagation algorithm, task learning, text chunking