ETRI Knowledge Sharing Platform

Emotion-Aware Autonomous Driving: A VR-Based Multimodal Analysis Using Gaze and Physiological Signals
Authors
Mi Chang, Jiwoo Han, Woojin Kim, Daesub Yoon
Issue Date
2025-12
Citation
ACM SIGGRAPH Asia (SA) 2025, pp.1-2
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1145/3757374.3771514
Abstract
While autonomous driving technology continues to advance, understanding of passengers' emotional experiences remains limited. This study examines emotion recognition in fully autonomous scenarios using a virtual reality (VR) simulation with two contrasting driving styles: Calm and Tense. Multimodal data analysis revealed significant behavioral and physiological differences between the two modes. A Random Forest classifier achieved 97.1% accuracy in distinguishing passenger emotional states, demonstrating the feasibility of emotion-aware autonomous driving systems.
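
The paper itself provides no code, but the classification step it describes can be illustrated with a minimal sketch. The sketch below assumes scikit-learn's RandomForestClassifier and hypothetical multimodal feature columns (pupil diameter, fixation duration, heart rate, electrodermal activity); the synthetic data stands in for the study's VR gaze and physiological recordings and will not reproduce the reported 97.1% accuracy.

```python
# Minimal sketch (not the authors' code): binary emotion-state classification
# with a Random Forest, as described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in data: rows = time windows, columns = multimodal
# features (e.g., pupil diameter, fixation duration, heart rate, EDA).
X = rng.normal(size=(400, 4))
y = rng.integers(0, 2, size=400)  # 0 = Calm driving style, 1 = Tense

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"Mean accuracy: {scores.mean():.3f}")
```

In practice, the feature matrix would be built by windowing the synchronized gaze and physiological streams from the VR sessions and labeling each window with the driving style (Calm or Tense) under which it was recorded.
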
KSP Keywords
Autonomous driving system, Data analysis, Driving styles, Emotion Recognition, Emotion-aware, Emotional states, Multimodal analysis, Physiological signals, Random Forest Classifier, Virtual Reality, multimodal data