ETRI Knowledge Sharing Platform

Journal Article: Dual-scale BERT using multi-trait representations for holistic and trait-specific essay grading
Cited 3 times in Scopus · Downloaded 184 times
Authors
Minsoo Cho, Jin-Xia Huang, Oh-Woog Kwon
Issue Date
2024-02
Citation
ETRI Journal, v.46, no.1, pp.82-95
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.2023-0324
Abstract
As automated essay scoring (AES) has progressed from handcrafted techniques to deep learning, holistic scoring capabilities have emerged. However, specific trait assessment remains a challenge because earlier methods offered limited depth in modeling dual assessments for holistic and multi-trait tasks. To overcome this challenge, we explore providing comprehensive feedback while modeling the interconnections between holistic and trait representations. We introduce the DualBERT-Trans-CNN model, which combines transformer-based representations with a novel dual-scale bidirectional encoder representations from transformers (BERT) encoding approach at the document level. By explicitly leveraging multi-trait representations in a multi-task learning (MTL) framework, DualBERT-Trans-CNN emphasizes the interrelation between holistic and trait-based score predictions, aiming for improved accuracy. For validation, we conducted extensive tests on the ASAP++ and TOEFL11 datasets. Against models in the same MTL setting, ours showed a 2.0% increase in holistic score performance. Additionally, compared with single-task learning (STL) models, ours demonstrated a 3.6% improvement in average multi-trait performance on the ASAP++ dataset.
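
To illustrate the multi-task learning idea described in the abstract (a shared document representation feeding trait-specific heads whose outputs also inform the holistic prediction), the following is a minimal PyTorch sketch. All module names, dimensions, and the way trait outputs feed the holistic head are illustrative assumptions; this is not the paper's actual DualBERT-Trans-CNN architecture, which additionally relies on dual-scale BERT encoding and Transformer/CNN layers not shown here.

import torch
import torch.nn as nn

class MultiTraitScorer(nn.Module):
    # Hypothetical MTL scorer: one regression head per trait plus a holistic head
    # that also sees the trait predictions, echoing the holistic-trait interrelation.
    def __init__(self, doc_dim: int = 768, num_traits: int = 4, hidden: int = 256):
        super().__init__()
        # Shared projection over a precomputed document embedding
        # (e.g., a BERT [CLS] vector; how it is obtained is out of scope here).
        self.shared = nn.Sequential(nn.Linear(doc_dim, hidden), nn.ReLU())
        self.trait_heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(num_traits)])
        self.holistic_head = nn.Linear(hidden + num_traits, 1)

    def forward(self, doc_emb: torch.Tensor):
        h = self.shared(doc_emb)                                   # (batch, hidden)
        traits = torch.cat([head(h) for head in self.trait_heads], dim=-1)
        holistic = self.holistic_head(torch.cat([h, traits], dim=-1))
        return holistic.squeeze(-1), traits                        # holistic, per-trait scores

# Joint training objective: sum of holistic and per-trait regression losses.
model = MultiTraitScorer()
doc_emb = torch.randn(8, 768)          # stand-in for BERT document embeddings
gold_holistic, gold_traits = torch.rand(8), torch.rand(8, 4)
pred_h, pred_t = model(doc_emb)
loss = nn.functional.mse_loss(pred_h, gold_holistic) + nn.functional.mse_loss(pred_t, gold_traits)
loss.backward()

In this sketch the single combined loss is what makes the setup multi-task: gradients from both the holistic and trait objectives update the shared layers, which is the mechanism the abstract credits for the reported gains over single-task training.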
KSP Keywords
Automated Essay Scoring(AES), CNN model, Essay grading, Improved Accuracy, deep learning(DL), task learning
This work is distributed under the terms of the Korea Open Government License (KOGL)
(Type 4: Type 1 + Commercial Use Prohibition + Change Prohibition).