ETRI Knowledge Sharing Platform





Conference Paper: Multimodal Data Fusion and Intention Recognition for Horse Riding Simulators
강상승, 김계경, 지수영
International Conference on Big Data Applications and Services (BigDAS) 2015, pp.1-2
For natural interaction, a substantial interactive process between the human and the simulator system must be provided. This process usually requires continuous intention recognition. In this paper, we present an intention recognition system based on multimodal data fusion using multiple sensors in a horse riding simulator environment. The system provides an effective interactive function based on recognition of the riding intention. It adopts schemes for multimodal sensor data acquisition, multimodal feature extraction and fusion, and template matching. Recognizing the user's riding intentions makes it possible to provide expressive, realistic interaction. Copyright is held by the owner/author(s). Publication rights licensed to ACM.
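The abstract describes a pipeline of multimodal sensor data acquisition, feature-level fusion, and template matching. The following is a minimal illustrative sketch of that general approach, not the authors' implementation: features from each sensor are normalized and concatenated, and the riding intention is recognized as the label of the nearest stored template under Euclidean distance. The sensor features, intention labels, and template values below are all hypothetical.

```python
import math

def fuse(feature_vectors):
    """Feature-level fusion: normalize each sensor's feature vector, then concatenate."""
    fused = []
    for vec in feature_vectors:
        norm = math.sqrt(sum(x * x for x in vec)) or 1.0
        fused.extend(x / norm for x in vec)
    return fused

def match_template(fused, templates):
    """Return the intention label whose stored template is closest to the fused vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda label: dist(fused, templates[label]))

# Hypothetical templates for three riding intentions (two sensors, two features each).
templates = {
    "walk":   fuse([[1.0, 0.1], [0.2, 0.0]]),
    "trot":   fuse([[0.5, 0.9], [0.7, 0.3]]),
    "gallop": fuse([[0.1, 1.0], [0.9, 0.8]]),
}

# A new observation, e.g. posture features plus saddle-pressure features.
observation = fuse([[0.9, 0.2], [0.25, 0.05]])
print(match_template(observation, templates))  # prints "walk"
```

Concatenation is one of the simplest fusion strategies; per-sensor normalization keeps a sensor with large raw magnitudes from dominating the distance computation.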
KSP Suggested Keywords
Data acquisition (DAQ), Feature extraction, Horse riding simulator, Interactive process, Multimodal sensor, Natural interaction, Recognition system, Sensor data acquisition, Template matching, Intention recognition, Multimodal data fusion