ETRI Knowledge Sharing Platform

Detailed Information

Conference Paper: Multimodal Data Fusion and Intention Recognition for Horse Riding Simulators
Authors
강상승, 김계경, 지수영
Publication Date
October 2015
Source
International Conference on Big Data Applications and Services (BigDAS) 2015, pp.1-2
DOI
https://dx.doi.org/10.1145/2837060.2837124
Project
15MC1800, Development of a General-Purpose Five-Sense Convergence Sports Simulator Based on a Multi-Axis Motion Platform, 지수영
Abstract
For natural interaction, a substantial interactive process between the human and the simulator system must be provided. This process usually requires continuous intention recognition. In this paper, we present an intention recognition system based on multimodal data fusion using multiple sensors in a horse riding simulator environment. The system provides an effective interactive function based on riding intention recognition. It adopts schemes for multimodal sensor data acquisition, multimodal feature extraction and fusion, and template matching. This makes it possible to provide expressive depth and realistic interaction through recognition of the user's riding intentions. Copyright is held by the owner/author(s). Publication rights licensed to ACM.
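The abstract describes a pipeline of multimodal sensor data acquisition, feature extraction and fusion, and template matching. The sketch below is a rough illustration of that general scheme, not the paper's actual implementation: per-sensor feature vectors are fused by concatenation and an intention label is chosen by nearest-template distance. The sensor names, feature statistics, and templates are all hypothetical placeholders.

```python
# Minimal sketch of feature-level multimodal fusion + template matching.
# Not the authors' implementation; all sensors and templates are hypothetical.
import numpy as np

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel statistics (mean, std) as features for one sensor window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(feature_vectors: list[np.ndarray]) -> np.ndarray:
    """Feature-level fusion: normalize each sensor's features, then concatenate."""
    normed = [(f - f.mean()) / (f.std() + 1e-8) for f in feature_vectors]
    return np.concatenate(normed)

def match_template(fused: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the intention label whose template is closest in Euclidean distance."""
    return min(templates, key=lambda label: np.linalg.norm(fused - templates[label]))

# Hypothetical usage: windows from a saddle pressure pad, rein tension sensor, and IMU.
pressure = np.random.rand(100, 4)   # 100 samples x 4 pressure cells
rein     = np.random.rand(100, 2)   # 100 samples x 2 rein-tension channels
imu      = np.random.rand(100, 6)   # 100 samples x 6 IMU axes

fused = fuse([extract_features(s) for s in (pressure, rein, imu)])
templates = {"walk": np.zeros_like(fused), "trot": np.ones_like(fused)}  # placeholder templates
print(match_template(fused, templates))
```

In practice the templates would be learned from recorded riding sessions for each intention class, and the distance measure and feature set would depend on the actual sensors used; this example only shows the overall flow.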
KSP Suggested Keywords
Data Acquisition (DAQ), Feature extraction, Horse riding simulator, Interactive process, Multimodal sensor, Natural Interaction, Recognition System, Sensor data acquisition, Template matching, intention recognition, multimodal data fusion