ETRI Knowledge Sharing Platform

Multimodal Data Fusion and Intention Recognition for Horse Riding Simulators
Authors
Sangseung Kang, Kyekyung Kim, Suyoung Chi
Issue Date
2015-10
Citation
International Conference on Big Data Applications and Services (BigDAS) 2015, pp.1-2
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1145/2837060.2837124
Abstract
Natural interaction requires a substantial interactive process between the human and the simulator system, and that process usually depends on continuous intention recognition. In this paper, we present an intention recognition system based on multimodal data fusion from multiple sensors in a horse riding simulator environment. The system provides an effective interactive function based on recognition of the rider's intentions. It adopts schemes for multimodal sensor data acquisition, multimodal feature extraction and fusion, and template matching. Recognizing the user's riding intentions makes it possible to provide expressive depth and realistic interaction. Copyright is held by the owner/author(s). Publication rights licensed to ACM.
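The abstract outlines a pipeline of multimodal sensor data acquisition, per-modality feature extraction, feature-level fusion, and template matching. The following is a minimal sketch of such a pipeline, not the paper's implementation: the sensor modalities (rein and saddle signals), the feature choices, the concatenation-based fusion, the intention templates, and nearest-template matching by Euclidean distance are all illustrative assumptions.

```python
# Sketch of the pipeline named in the abstract:
# sensor data -> per-modality features -> fused vector -> template matching.
# All sensor names, feature choices, and templates below are hypothetical,
# not taken from the paper.
import numpy as np

def extract_features(signal: np.ndarray) -> np.ndarray:
    """Toy per-modality features: mean, standard deviation, and peak amplitude."""
    return np.array([signal.mean(), signal.std(), np.abs(signal).max()])

def fuse(feature_vectors: list[np.ndarray]) -> np.ndarray:
    """Feature-level fusion by simple concatenation of per-modality features."""
    return np.concatenate(feature_vectors)

def match_template(fused: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Nearest-template matching by Euclidean distance in the fused feature space."""
    return min(templates, key=lambda label: np.linalg.norm(fused - templates[label]))

# Hypothetical templates for three riding intentions.
templates = {
    "walk": np.array([0.10, 0.05, 0.20, 0.05, 0.10, 0.30]),
    "trot": np.array([0.40, 0.20, 0.80, 0.10, 0.30, 0.90]),
    "stop": np.array([0.00, 0.01, 0.05, 0.00, 0.00, 0.05]),
}

# Simulated readings from two hypothetical sensor modalities
# (e.g. rein tension and saddle pressure).
rng = np.random.default_rng(0)
rein_signal = rng.normal(0.1, 0.05, 100)
saddle_signal = rng.normal(0.2, 0.10, 100)

fused = fuse([extract_features(rein_signal), extract_features(saddle_signal)])
print("Recognized intention:", match_template(fused, templates))
```

In this sketch the fused vector is simply the concatenation of each modality's features; a real system could instead weight modalities or match templates per modality before combining decisions.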
KSP Keywords
Data acquisition (DAQ), Feature extraction, Horse riding simulator, Intention recognition, Interactive process, Multimodal sensor, Natural interaction, Recognition system, Sensor data acquisition, Template matching, Multimodal data fusion