ETRI Knowledge Sharing Platform

Multi-Modal User Interaction Method Based on Gaze Tracking and Gesture Recognition
Cited 15 times in Scopus, downloaded 2 times
Authors
Heekyung Lee, Seong Yong Lim, Injae Lee, Jihun Cha, Dong-Chan Cho, Sunyoung Cho
Issue Date
2013-02
Citation
Signal Processing: Image Communication, v.28, no.2, pp.114-126
ISSN
0923-5965
Publisher
Elsevier
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.1016/j.image.2012.10.007
Project Code
12PR4200, Development of Interactive View Control Technologies for IPTV, Cha Jihun
Abstract
This paper presents a gaze tracking technology that provides a convenient, human-centric interface for multimedia consumption without any wearable device. It enables a user to interact with multimedia content on a large display at a distance by tracking the user's movement and acquiring high-resolution eye images. The paper also presents a gesture recognition technology that helps the user interact with scene descriptions, controlling and rendering scene objects; it is based on a hidden Markov model (HMM) and a conditional random field (CRF) operating on input from a commercial depth sensor. Finally, the paper describes how these new sensors can be combined with MPEG standards to achieve interoperability among interactive applications, new user interaction devices, and users. © 2012 Elsevier B.V. All rights reserved.
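The abstract describes the gesture recognizer only at a high level: HMMs and CRFs over depth-sensor input. As a hedged illustration of the HMM half of that idea, the sketch below trains one Gaussian HMM per gesture class on feature sequences and labels a new sequence by maximum log-likelihood. The hmmlearn library, the feature layout, and every name in the code are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch of HMM-based gesture classification, in the spirit of
# the approach named in the abstract; NOT the authors' implementation.
# Assumptions: the hmmlearn library, per-frame feature vectors (e.g.
# joint coordinates derived from a depth sensor), and all names below
# are illustrative only.
import numpy as np
from hmmlearn import hmm  # pip install hmmlearn

def train_gesture_models(sequences_by_label, n_states=4):
    """Fit one Gaussian HMM per gesture label.

    sequences_by_label maps label -> list of (T_i, D) arrays, each a
    sequence of D-dimensional features over T_i depth frames.
    """
    models = {}
    for label, seqs in sequences_by_label.items():
        X = np.vstack(seqs)               # concatenate all observations
        lengths = [len(s) for s in seqs]  # per-sequence frame counts
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag",
                                n_iter=50, random_state=0)
        model.fit(X, lengths)
        models[label] = model
    return models

def classify(models, seq):
    """Label a sequence by the HMM with the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(seq))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for depth-derived features: two gesture
    # classes with clearly separated feature statistics.
    train = {
        "swipe": [rng.normal(0.0, 1.0, size=(30, 3)) for _ in range(5)],
        "wave":  [rng.normal(3.0, 1.0, size=(30, 3)) for _ in range(5)],
    }
    models = train_gesture_models(train)
    test_seq = rng.normal(3.0, 1.0, size=(30, 3))
    print(classify(models, test_seq))  # expected: "wave"
```

In the paper's setting the feature sequences would come from a commercial depth sensor rather than the synthetic data used here, and a CRF stage could refine the per-frame labeling.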
KSP Keywords
Depth sensor, High-resolution, Interaction devices, Multi-modal User Interaction, Tracking technology, Wearable device, gaze tracking, gesture recognition technology, hidden Markov Model, human-centric, interactive applications