ETRI Knowledge Sharing Platform



Details

Journal Article
DG-based SPO Tuple Recognition using Self-attention M-Bi-LSTM
Cited 4 times in Scopus; downloaded 131 times.
Authors
정준영
Issue Date
2022-06
Source
ETRI Journal, v.44 no.3, pp.438-449
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.2020-0460
Project
20ZS1100, Research on Core Technologies for Self-Improving Integrated Artificial Intelligence (자율성장형 복합인공지능 원천기술 연구), 송화전
Abstract
This study proposes a dependency grammar-based self-attention multilayered bidirectional long short-term memory (DG-M-Bi-LSTM) model for subject-predicate-object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to the knowledge base autonomously, it is essential to extract knowledge from numerous NL data. Therefore, this study proposes a high-accuracy SPO tuple recognition model that requires only a small amount of learning data to extract knowledge from NL sentences. The accuracy of SPO tuple recognition using DG-M-Bi-LSTM is compared with that of an NL-based self-attention multilayered bidirectional LSTM, DG-based bidirectional encoder representations from transformers (BERT), and NL-based BERT to evaluate its effectiveness. The DG-M-Bi-LSTM model achieves the best recognition accuracy for extracting SPO tuples from NL sentences even though it has fewer deep neural network (DNN) parameters than BERT. In particular, its accuracy is better than that of BERT when the learning data are limited. Additionally, its pretrained DNN parameters can be applied to other domains because it learns the structural relations in NL sentences.
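The self-attention step at the heart of the model can be sketched as follows. This is a minimal, hedged illustration of scaled dot-product self-attention over a token sequence; the paper's actual model additionally uses a multilayered Bi-LSTM encoder, dependency-grammar inputs, and learned query/key/value projections, none of which are shown here.

```python
import math

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of token vectors.

    X: list of n token vectors (lists of floats), each of dimension d.
    Returns n output vectors, each a softmax-weighted mixture of all inputs.
    For simplicity the raw inputs serve as queries, keys, and values
    (no learned projection matrices, unlike the full DG-M-Bi-LSTM model).
    """
    n, d = len(X), len(X[0])
    scale = math.sqrt(d)  # standard scaling factor for dot-product attention
    out = []
    for i in range(n):
        # attention scores of token i against every token j
        scores = [sum(X[i][k] * X[j][k] for k in range(d)) / scale
                  for j in range(n)]
        # numerically stable softmax over the scores
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # output: attention-weighted sum of the value vectors
        out.append([sum(w * X[j][k] for j, w in enumerate(weights))
                    for k in range(d)])
    return out
```

Each output vector is a convex combination of the input vectors, so every token's representation is contextualized by the whole sentence before SPO tagging.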
KSP Suggested Keywords
Beacon Interval(BI), Bi-LSTM, Bidirectional Long Short-Term Memory, Deep neural network(DNN), High accuracy, Learning data, Long-short term memory(LSTM), Recognition Accuracy, Recognition model, bidirectional LSTM, dependency grammar
This work may be used under the Korea Open Government License (KOGL) Type 4: Attribution + Noncommercial + No Derivatives.