ETRI Knowledge Sharing Platform

Consolidation of Subtasks for Target Task in Pipelined NLP Model
Cited 11 times in Scopus. Downloaded 11 times.
Authors
Jeong-Woo Son, Heegeun Yoon, Seong-Bae Park, Keeseong Cho, Won Ryu
Issue Date
2014-10
Citation
ETRI Journal, v.36, no.5, pp.704-713
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.14.2214.0035
Project Code
13PR4300, Development of Knowledge Convergence Service Platform based on Multimedia Object Recognition, Cho Kee Seong
Abstract
Most natural language processing tasks depend on the outputs of some other tasks. Thus, they involve other tasks as subtasks. The main problem of this type of pipelined model is that the optimality of the subtasks that are trained with their own data is not guaranteed in the final target task, since the subtasks are not optimized with respect to the target task. As a solution to this problem, this paper proposes a consolidation of subtasks for a target task (CST2). In CST2, all parameters of a target task and its subtasks are optimized to fulfill the objective of the target task. CST2 finds such optimized parameters through a backpropagation algorithm. In experiments in which text chunking is a target task and part-of-speech tagging is its subtask, CST2 outperforms a traditional pipelined text chunker. The experimental results prove the effectiveness of optimizing subtasks with respect to the target task.
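The abstract describes optimizing a subtask (part-of-speech tagging) jointly with its target task (text chunking) by backpropagating the target-task loss through both models. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' CST2 implementation: the model sizes, LSTM layers, and the softmax coupling between tagger and chunker are assumptions made only to show how one chunking loss can update all parameters, including the tagger's.

```python
# Illustrative sketch (not the paper's CST2 model): a POS-tagging subtask feeds a
# text-chunking target task, and a single backpropagation pass through the chunking
# loss updates the parameters of both components.
import torch
import torch.nn as nn

VOCAB, N_POS, N_CHUNK, EMB, HID = 1000, 45, 23, 64, 128  # toy sizes (assumed)

class PosTagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, N_POS)

    def forward(self, tokens):
        h, _ = self.rnn(self.emb(tokens))
        return self.out(h)                      # per-token POS scores

class Chunker(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.LSTM(N_POS, HID, batch_first=True)
        self.out = nn.Linear(HID, N_CHUNK)

    def forward(self, pos_scores):
        # A soft (differentiable) POS distribution lets the chunking loss
        # flow back into the tagger's parameters instead of a hard tag decision.
        h, _ = self.rnn(torch.softmax(pos_scores, dim=-1))
        return self.out(h)                      # per-token chunk scores

tagger, chunker = PosTagger(), Chunker()
# One optimizer over *all* parameters: the subtask is tuned for the target task.
opt = torch.optim.Adam(list(tagger.parameters()) + list(chunker.parameters()))
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, VOCAB, (8, 20))       # toy batch: 8 sentences, 20 tokens
chunk_gold = torch.randint(0, N_CHUNK, (8, 20))

chunk_scores = chunker(tagger(tokens))
loss = loss_fn(chunk_scores.view(-1, N_CHUNK), chunk_gold.view(-1))
loss.backward()                                 # gradients reach the POS tagger too
opt.step()
```

In a traditional pipeline, the tagger would instead be trained separately on its own POS data and frozen, which is exactly the mismatch the abstract says CST2 is meant to remove.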
KSP Keywords
All parameters, Back Propagation Algorithm, NLP model, Natural Language Processing, Optimized parameters, Part of Speech(POS), Part-Of-Speech Tagging, text chunking