ETRI Knowledge Sharing Platform


Drug-BERT: Pre-trained Language Model Specialized for Korean Drug Crime
Authors
Jeong Min Lee, Suyeon Lee, Sungwon Byon, Eui-Suk Jung, Myung-Sun Baek
Issue Date
2024-06
Citation
International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB) 2024, pp.1-3
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/BMSB62888.2024.10608314
Abstract
We propose Drug-BERT, a pre-trained language model specialized for detecting drug-related content in Korean. Given the severity of the current drug problem in South Korea, effective responses are imperative. Focusing on the distinctive features of drug slang, this study seeks to improve the identification and classification of drug-related posts on social media platforms. Recent drug slang terms are gathered and used to collect drug-related posts, and the collected data is used to pre-train the language model, Drug-BERT. The results show that fine-tuned Drug-BERT outperforms the comparative models, achieving 99.43% accuracy in classifying drug-related posts. Drug-BERT presents a promising solution for combating drug-related activities, contributing to proactive measures against drug crimes in the Korean context.
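The classification stage the abstract describes (fine-tuning the pre-trained model to label posts as drug-related or not) can be sketched roughly as follows. This is a minimal illustration assuming a standard Hugging Face transformers workflow; the checkpoint name, toy data, and hyperparameters are assumptions for illustration, not details taken from the paper.

import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical base checkpoint standing in for the Drug-BERT weights;
# any Korean BERT checkpoint would fit the same sketch.
checkpoint = "klue/bert-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Toy stand-ins for the collected social media posts (1 = drug-related, 0 = not).
texts = ["예시 게시글 하나", "예시 게시글 둘"]
labels = torch.tensor([1, 0])

enc = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
loader = DataLoader(TensorDataset(enc["input_ids"], enc["attention_mask"], labels),
                    batch_size=2, shuffle=True)

# Standard fine-tuning loop: the cross-entropy loss is computed internally
# when labels are passed to the model.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for input_ids, attention_mask, y in loader:
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

In the paper's setting, the base weights would first be adapted to the slang-heavy corpus (the posts collected via recent drug slang terms) before this classification step, which is what distinguishes Drug-BERT from an off-the-shelf Korean BERT.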
KSP Keywords
Distinctive features, Drug crimes, Korean language, Language Model, Pre-trained model, Social media platforms, South Korea, Related contents