ETRI Knowledge Sharing Platform

Details

Conference Paper  TK-BERT: Effective Model of Language Representation using Topic-based Knowledge Graphs
Cited 1 time in Scopus
Authors
민찬욱, 안진현, 이태휘, 임동혁
Publication Date
January 2023
Source
International Conference on Ubiquitous Information Management and Communication (IMCOM) 2023, pp.1-4
DOI
https://dx.doi.org/10.1109/IMCOM56909.2023.10035573
Project
22HS4100, Development of Approximate Query DBMS Technology for Exploratory Data Analysis with Fast Query Processing on Big Data, 이태휘
Abstract
Recently, the K-BERT model was proposed to add knowledge for language representation in specialized fields. The K-BERT model uses a knowledge graph to perform transfer learning on a pre-trained BERT model. However, when using the knowledge graph of a given field, K-BERT injects whatever knowledge exists in the graph rather than only the knowledge relevant to the topic of the input, which can confuse training. To solve this problem, this study proposes the topic-based knowledge graph BERT (TK-BERT) model, which uses topic modeling. TK-BERT partitions the knowledge graph by topic using a topic model of the graph, infers the topic of each input sentence, and injects only the knowledge relevant to that topic. The TK-BERT model therefore avoids injecting unnecessary knowledge, and the proposed model outperforms the K-BERT model.
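The topic-filtered knowledge injection described in the abstract can be sketched roughly as follows. This is a toy illustration, not the authors' implementation: the topic inference uses simple keyword overlap as a stand-in for a real topic model (e.g. LDA), and all topic names, vocabularies, and triples are hypothetical.

```python
# Hypothetical sketch of TK-BERT-style topic-filtered knowledge injection.
# The knowledge graph is pre-partitioned by topic; for each input sentence we
# infer a topic, then inject only triples from that topic's partition.

# Knowledge graph partitioned by topic (toy, hypothetical data)
KG_BY_TOPIC = {
    "finance": [("stock", "is_a", "security"), ("bond", "is_a", "debt_instrument")],
    "medicine": [("aspirin", "treats", "pain"), ("insulin", "regulates", "glucose")],
}

# Per-topic vocabularies; a real system would use a trained topic model instead
TOPIC_VOCAB = {
    "finance": {"stock", "bond", "market", "price"},
    "medicine": {"aspirin", "insulin", "patient", "dose"},
}

def infer_topic(sentence: str) -> str:
    """Pick the topic whose vocabulary overlaps the sentence most (LDA stand-in)."""
    tokens = set(sentence.lower().split())
    return max(TOPIC_VOCAB, key=lambda t: len(tokens & TOPIC_VOCAB[t]))

def inject_knowledge(sentence: str):
    """Return the sentence plus only the triples from the inferred topic's
    partition whose subject appears in the sentence (filtered injection)."""
    topic = infer_topic(sentence)
    relevant = [t for t in KG_BY_TOPIC[topic] if t[0] in sentence.lower()]
    return sentence, relevant

sent, triples = inject_knowledge("The stock price fell as the bond market closed")
print(triples)  # only finance triples; medicine knowledge is never injected
```

The key contrast with plain K-BERT is the partition lookup: instead of matching entities against the entire knowledge graph, only the inferred topic's subgraph is consulted, so off-topic triples cannot enter the sentence tree.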
KSP Suggested Keywords
Effective model, Language representation, Modeling techniques, Topic Modeling, Topic-based, Transfer learning, input data, knowledge graph