ETRI Knowledge Sharing Platform

Simple and effective neural coreference resolution for Korean language
Cited 6 times in Scopus; downloaded 21 times
Authors
Cheoneum Park, Joonho Lim, Jihee Ryu, Hyunki Kim, Changki Lee
Issue Date
2021-12
Citation
ETRI Journal, v.43, no.6, pp.1038-1048
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
Korean
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.2020-0282
Abstract
We propose an end-to-end neural coreference resolution model for the Korean language that uses an attention mechanism to point to the same entity. Because Korean is a head-final language, we focused on a method that uses a pointer network based on the head. The key idea is to consider all nouns in the document as candidates, based on the head-final characteristic of Korean, and to learn distributions over the referenced entity positions for each noun. Given the recent success of bidirectional encoder representations from transformers (BERT) in natural language processing tasks, we employed BERT in the proposed model to create word representations based on contextual information. The experimental results indicate that the proposed model achieves state-of-the-art performance in Korean coreference resolution.
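The core mechanism the abstract describes, treating every noun as a candidate and learning a distribution over earlier positions for each one, can be illustrated with a minimal sketch. The function below is a hypothetical simplification, not the authors' model: it assumes contextual vectors (e.g., from BERT) are already given for each noun, scores all preceding nouns plus a dummy "no antecedent" slot with dot-product attention, and links each noun to its highest-scoring antecedent.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def pointer_antecedents(noun_vecs):
    """For each noun (in document order), attend over a dummy zero vector
    (index 0, meaning 'no antecedent') and all preceding noun vectors,
    then return the argmax antecedent index (-1 = no antecedent).
    noun_vecs: list of 1-D numpy arrays, one per noun, e.g. BERT outputs."""
    links = []
    for i, query in enumerate(noun_vecs):
        # Candidates: dummy 'no antecedent' slot + all earlier nouns.
        cands = [np.zeros_like(query)] + [noun_vecs[j] for j in range(i)]
        scores = np.array([query @ c for c in cands])
        probs = softmax(scores)          # distribution over positions
        links.append(int(probs.argmax()) - 1)  # shift: slot 0 -> -1
    return links
```

In the actual model the attention scores would be trained so that coreferent nouns receive high probability mass; here the untrained dot product simply links each noun to its most similar predecessor, which is enough to show the pointer-style scoring shape.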
This work is distributed under the terms of the Korea Open Government License (KOGL), Type 4: Type 1 + Commercial Use Prohibition + Change Prohibition.