ETRI Knowledge Sharing Platform

Nonparametric Gesture Labeling from Multi-modal Data
Cited 9 times in Scopus
Authors
Ju Yong Chang
Issue Date
2014-09
Citation
European Conference on Computer Vision (ECCV) 2014 (LNCS 8925), v.8925, pp.503-517
Publisher
Springer
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1007/978-3-319-16178-5_35
Abstract
We present a new gesture recognition method using multi-modal data. Our approach solves a labeling problem, meaning that gesture categories and their temporal ranges are determined at the same time. For that purpose, a generative probabilistic model is formulated and constructed by nonparametrically estimating multi-modal densities from a training dataset. In addition to conventional skeletal-joint-based features, appearance information near the active hand in the RGB image is exploited to capture the detailed motion of the fingers. The estimated log-likelihood function is used as the unary term of our Markov random field (MRF) model, and a smoothness term is incorporated to enforce temporal coherence. The labeling result can then be obtained by an efficient dynamic programming technique. Experimental results demonstrate that our method provides effective gesture labeling on a large-scale gesture dataset. Our method scores 0.8268 in mean Jaccard index and is ranked 3rd in the gesture recognition track of the ChaLearn Looking at People (LAP) Challenge 2014.
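
To make the labeling step described in the abstract concrete, below is a minimal Python sketch, not the authors' implementation: it solves a chain MRF with per-frame negative log-likelihood unary terms and a simple Potts smoothness penalty by dynamic programming, and evaluates a frame-level Jaccard index. The log_likelihood array, the smoothness scalar, and both function names are illustrative assumptions; the paper's actual unary and smoothness terms are built from its nonparametric multi-modal density estimates.

import numpy as np

def label_sequence(log_likelihood, smoothness=1.0):
    """Assign a gesture label to every frame by minimizing an MRF energy.

    Sketch under assumptions: the unary term is the negative log-likelihood
    of each label per frame (assumed to be given as a (T, K) array), and the
    pairwise term is a Potts penalty added whenever adjacent frames change
    label. The chain structure lets dynamic programming find the exact optimum.
    """
    T, K = log_likelihood.shape
    unary = -log_likelihood                      # energy to be minimized
    cost = np.zeros((T, K))
    back = np.zeros((T, K), dtype=int)

    cost[0] = unary[0]
    for t in range(1, T):
        # transition cost: 0 if the label stays the same, `smoothness` otherwise
        trans = cost[t - 1][:, None] + smoothness * (1 - np.eye(K))
        back[t] = np.argmin(trans, axis=0)
        cost[t] = unary[t] + np.min(trans, axis=0)

    # backtrack the minimum-energy labeling
    labels = np.empty(T, dtype=int)
    labels[-1] = int(np.argmin(cost[-1]))
    for t in range(T - 2, -1, -1):
        labels[t] = back[t + 1, labels[t + 1]]
    return labels

def jaccard_index(pred, gt):
    """Frame-level Jaccard index (intersection over union) between predicted
    and ground-truth binary frame masks of one gesture instance; one common
    definition, assumed here for illustration."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    union = np.logical_or(pred, gt).sum()
    return np.logical_and(pred, gt).sum() / union if union else 1.0

Usage would be along the lines of labels = label_sequence(log_lik, smoothness=2.0), where log_lik holds per-frame log-likelihoods for each gesture class (plus a background class); larger smoothness values merge short spurious segments, trading responsiveness for temporal coherence.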
KSP Keywords
Appearance information, Dynamic programming technique, Generative probabilistic model, Gesture datasets, Gesture recognition, Jaccard index, Markov Random Field, RGB image, Recognition method, Temporal Coherence, large-scale