ETRI Knowledge Sharing Platform

Title
Deep Multimodal Emotion Recognition using Modality Aware Attention Network for Unifying Representations in Neural Models
Authors
Sungpil Woo, Muhammad Zubair, Sunhwan Lim, Daeyoung Kim
Issue Date
2023-12
Citation
Conference on Neural Information Processing Systems (NeurIPS) 2023 Workshop, pp. 1-5
Language
English
Type
Conference Paper
Abstract
This paper introduces a multi-modal emotion recognition system aimed at enhancing emotion recognition by integrating representations from physiological signals. To accomplish this goal, we propose a modality-aware attention network that extracts emotion-specific features by influencing and aligning the representation spaces of the various modalities into a unified representation. Through a series of experiments and visualizations conducted on the AMIGO dataset, we demonstrate the efficacy of the proposed methodology for emotion classification, highlighting its capability to provide comprehensive representations of physiological signals.
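
The abstract describes an attention mechanism that maps several physiological modalities into a shared representation space and fuses them before emotion classification. The sketch below is a minimal PyTorch illustration of that general idea, not the authors' implementation; the modality names (eeg, ecg, gsr), feature dimensions, shared dimension, and four-class output are assumptions chosen only for illustration.

# Minimal sketch (assumptions, not the paper's code): per-modality encoders
# project physiological features into a shared space, learned attention
# weights the modalities, and the fused vector is classified.
import torch
import torch.nn as nn

class ModalityAwareAttentionFusion(nn.Module):
    def __init__(self, input_dims, shared_dim=64, num_classes=4):
        super().__init__()
        # One encoder per modality, mapping raw features to the shared space.
        self.encoders = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(dim, shared_dim), nn.ReLU())
            for name, dim in input_dims.items()
        })
        # Scores each modality embedding to produce attention weights.
        self.attn_score = nn.Linear(shared_dim, 1)
        self.classifier = nn.Linear(shared_dim, num_classes)

    def forward(self, inputs):
        # inputs: dict of modality name -> (batch, feature_dim) tensor
        embeddings = torch.stack(
            [self.encoders[name](x) for name, x in inputs.items()], dim=1
        )                                           # (batch, M, shared_dim)
        weights = torch.softmax(self.attn_score(embeddings), dim=1)  # (batch, M, 1)
        fused = (weights * embeddings).sum(dim=1)   # unified representation
        return self.classifier(fused), weights

# Usage with hypothetical EEG/ECG/GSR feature sizes and a batch of 8.
model = ModalityAwareAttentionFusion({"eeg": 128, "ecg": 32, "gsr": 16})
batch = {"eeg": torch.randn(8, 128), "ecg": torch.randn(8, 32), "gsr": torch.randn(8, 16)}
logits, attn = model(batch)
print(logits.shape, attn.shape)  # torch.Size([8, 4]) torch.Size([8, 3, 1])

The attention weights make the fusion "modality aware": each sample's prediction can lean on whichever physiological channel carries the clearer emotional signal, while all modalities are still embedded in one shared space.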
KSP Keywords
Emotion classification, Multimodal emotion recognition, Neural models, Physiological signals, Specific features, emotion recognition system(ERS)