ETRI Knowledge Sharing Platform

TK-BERT: Effective Model of Language Representation using Topic-based Knowledge Graphs
Cited 2 times in Scopus
Authors
Chanwook Min, Jinhyun Ahn, Taewhi Lee, Dong-Hyuk Im
Issue Date
2023-01
Citation
International Conference on Ubiquitous Information Management and Communication (IMCOM) 2023, pp.1-4
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/IMCOM56909.2023.10035573
Abstract
Recently, the K-BERT model was proposed to add domain knowledge for language representation in specialized fields. K-BERT performs transfer learning on a pre-trained BERT model by injecting triples from a knowledge graph. However, when using the knowledge graph of the corresponding field, K-BERT injects any matching knowledge in the graph regardless of whether it is relevant to the topic of the input data, which can confuse training. To solve this problem, this study proposes a topic-based knowledge graph BERT (TK-BERT) model that uses a topic modeling technique. TK-BERT partitions the knowledge graph by topic using a topic model of the graph, infers the topic of each input sentence, and adds only the knowledge relevant to that topic. The TK-BERT model therefore does not inject unnecessary knowledge from the knowledge graph, and the proposed model outperforms K-BERT.
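
The following is a minimal sketch of the topic-based knowledge selection step the abstract describes, not the paper's actual implementation. It assumes the knowledge-graph triples are verbalized as plain text, that topics are learned with LDA (scikit-learn's LatentDirichletAllocation), and that a triple or sentence is assigned its argmax topic; the toy triples and the relevant_triples helper are hypothetical.

```python
# Hypothetical sketch: partition knowledge-graph triples by topic, then
# keep only the triples whose topic matches the input sentence's topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy (subject, relation, object) triples; placeholders, not from the paper.
triples = [
    ("aspirin", "treats", "headache"),
    ("ibuprofen", "is_a", "anti-inflammatory drug"),
    ("python", "is_a", "programming language"),
    ("bert", "is_a", "language model"),
    ("paracetamol", "treats", "fever"),
    ("transformer", "used_in", "natural language processing"),
]
docs = [" ".join(t) for t in triples]  # verbalize triples for topic modeling

# Fit a topic model over the verbalized triples.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
triple_topics = lda.fit_transform(X).argmax(axis=1)  # dominant topic per triple

def relevant_triples(sentence: str):
    """Return only the triples whose dominant topic matches the sentence's."""
    dist = lda.transform(vectorizer.transform([sentence]))
    topic = int(dist.argmax())
    return [t for t, k in zip(triples, triple_topics) if k == topic]

# Only topic-matching knowledge would then be injected into the K-BERT
# sentence tree, instead of every triple whose entity appears in the input.
print(relevant_triples("aspirin relieves headache symptoms"))
```

In this reading, the topic filter acts as a gate in front of K-BERT's knowledge-injection stage, so off-topic triples never reach the model and cannot add noise during training.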
KSP Keywords
Effective model, Language representation, Modeling techniques, Topic Modeling, Topic-based, Transfer learning, input data, knowledge graph