ETRI Knowledge Sharing Platform

Title
A Study on Performance Analysis of Question Generation based on Korean Pretrained Language Model
Authors
HongYeon Yu, Jiwon Yang, Seunghun Oh, Donghoon Son, Aram Lee, Jeongeun Kim
Issue Date
2023-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2023, pp.1239-1241
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC58733.2023.10393015
Abstract
This paper presents a performance analysis of a question generation task based on a Korean pretrained language model, aimed at assessing elementary school students' reading comprehension of literary texts. The Korean pretrained language model is based on SKT KoBART and employs transfer learning on literary texts with narrative and expository structures, together with the KorQuAD dataset for question answering. Factual and inferential assessment items are extracted, BLEU scores are measured by literary text type, and the performance analysis results for the downstream question generation task are presented.
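
The pipeline summarized in the abstract can be illustrated with off-the-shelf tooling. The following is a minimal sketch, not the authors' implementation: it assumes the Hugging Face transformers and sacrebleu libraries, uses the public gogamza/kobart-base-v2 checkpoint as a stand-in for the SKT KoBART model referenced in the paper, and the answer-marking input format is an assumption, since the paper does not specify one.

```python
# Minimal sketch (not the authors' code): KoBART-based question generation
# plus corpus-level BLEU evaluation, as described at a high level in the abstract.
import sacrebleu
from transformers import BartForConditionalGeneration, PreTrainedTokenizerFast

MODEL_NAME = "gogamza/kobart-base-v2"  # assumed public KoBART checkpoint
tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

def generate_question(passage: str, answer: str, max_length: int = 64) -> str:
    """Generate a question for (passage, answer). The input format is an assumption."""
    # One common KorQuAD-style formulation: mark the answer, then append the passage.
    text = f"{answer} <unused0> {passage}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(
        inputs["input_ids"],
        max_length=max_length,
        num_beams=4,
        early_stopping=True,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

def corpus_bleu(hypotheses: list[str], references: list[str]) -> float:
    """BLEU over generated vs. reference questions; in the paper this is
    reported separately per literary text type (narrative vs. expository)."""
    return sacrebleu.corpus_bleu(hypotheses, [references]).score

if __name__ == "__main__":
    passage = "흥부는 제비의 다리를 고쳐 주었다."  # "Heungbu mended the swallow's leg."
    answer = "흥부"
    print(generate_question(passage, answer))
```

In the paper's setting, a model like this would first be fine-tuned on KorQuAD and on the narrative and expository literary texts before generation and BLEU scoring; the sketch above only shows the inference and evaluation interfaces.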
KSP Keywords
Language model, Literary text, Performance analysis, Question generation, Reading comprehension, Transfer learning, elementary school students, question answering