ETRI Knowledge Sharing Platform

Haptic Music Feedback through Audio Decomposition
Cited 0 times in Scopus
Authors
Sungyong Shin, HyeonBeom Yi, Junsuk Seo, Chi Yoon Jeong, Chang Hee Lee, Woohun Lee, Juhan Nam
Issue Date
2025-09
Citation
ACM Symposium on User Interface Software and Technology (UIST) 2025, pp.1-3
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1145/3746058.3758996
Abstract
This study presents a novel approach to improving musical haptic wearables by applying deep learning models for music source separation and pitch estimation. The system consists of a haptic vest and a pair of haptic gloves, designed to spatially convey musical elements across the body. By isolating instruments from an audio file and recognizing their pitches, the system provides intuitive haptic feedback that discretely represents each instrument's performance. Specifically, we map the piano to the gloves, the bass to the back of the vest, and the drums to the front of the vest. We then conducted a comparative study against a conventional audio-to-haptic method. The results showed that users experienced improved clarity, intuitiveness, and comprehension, indicating an enhanced understanding of musical structure.
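The pipeline described above (separate instrument sources, estimate each source's pitch, then route the result to a body-mapped actuator) can be illustrated with a toy sketch. This is not the paper's system: the deep-learning separation and pitch models are replaced by a naive autocorrelation pitch estimator, and the actuator mapping (`route_to_actuator`, the glove index computation, the MIDI range) is a hypothetical reading of the piano/bass/drums layout stated in the abstract.

```python
import numpy as np

SR = 22050  # assumed sample rate (Hz)

def estimate_pitch(frame: np.ndarray, sr: int = SR) -> float:
    """Naive autocorrelation pitch estimate (toy stand-in for the
    paper's deep-learning pitch estimation)."""
    frame = frame - frame.mean()
    # Autocorrelation: keep non-negative lags only
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Search lags corresponding to roughly 50-1000 Hz
    lo, hi = sr // 1000, sr // 50
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def route_to_actuator(instrument: str, pitch_hz: float) -> tuple[str, int]:
    """Hypothetical spatial mapping following the abstract:
    piano -> gloves (pitch selects a finger actuator),
    bass -> back of the vest, drums -> front of the vest."""
    if instrument == "piano":
        # Spread an assumed C2-C7 pitch range (MIDI 36-96)
        # across 10 glove actuators, one per finger
        midi = 69 + 12 * np.log2(pitch_hz / 440.0)
        idx = int(np.clip((midi - 36) / 60 * 10, 0, 9))
        return ("gloves", idx)
    if instrument == "bass":
        return ("vest_back", 0)
    return ("vest_front", 0)

# Demo: a 440 Hz sine standing in for a separated piano note
t = np.arange(int(0.1 * SR)) / SR
frame = np.sin(2 * np.pi * 440.0 * t)
pitch = estimate_pitch(frame)
print(route_to_actuator("piano", pitch))  # a middle glove actuator
```

A real implementation would substitute trained source-separation and pitch models for `estimate_pitch` and drive actual actuator hardware in `route_to_actuator`; the sketch only shows how the three stages compose.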
KSP Keywords
Audio File, Haptic Feedback, Haptic vest, Haptic wearables, Music source separation (MSS), Musical Structure, Novel approach, comparative study, deep learning (DL), deep learning models, pitch estimation