ETRI-Knowledge Sharing Platform


Intuitive Pointing Position Estimation for Large Scale Display Interaction in Top-view Depth Images
Cited 0 times in Scopus
Authors
Hye-mi Kim, Daehwan Kim, Yong Sun Kim, Ki-Hong Kim
Issue Date
2016-11
Citation
Asian Conference on Computer Vision (ACCV) 2016 : Workshops (LNCS 10118), pp.227-238
Publisher
Springer
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1007/978-3-319-54526-4_17
Abstract
In this paper, we propose an intuitive pointing position estimation method for large scale display interaction in top-view depth images. The depth sensor is mounted above the user's head in order to avoid the sensor occluding the display. To estimate the pointing position, we detect the user's head and estimate the position of the user's eye. To calculate the center of the head, we propose a head segmentation method. We use an iterative binary partitioning method to detect the hands and a one-to-one correspondence method to track them. The 3D positions of the head and hands are converted to real-world coordinates, and the pointing position is estimated as the point where the eye-hand ray intersects the large screen. Experimental results show that applying our head segmentation method improves the head detection rate. We also measure the pointing direction accuracy; the proposed method performs well compared with conventional methods, even in dark environments.
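The final step described in the abstract — casting a ray from the estimated eye position through the hand and intersecting it with the screen — reduces to a standard ray-plane intersection. A minimal sketch in Python follows; the function name, the planar screen model, and the example coordinates are illustrative assumptions, not details from the paper:

```python
import numpy as np

def pointing_position(eye, hand, screen_point, screen_normal):
    """Intersect the eye-hand ray with the screen plane.

    eye, hand: 3D positions in real-world coordinates.
    screen_point, screen_normal: any point on the screen plane and its normal.
    Returns the 3D intersection point, or None if the ray is parallel to the
    screen or the screen lies behind the user.
    """
    eye = np.asarray(eye, dtype=float)
    direction = np.asarray(hand, dtype=float) - eye   # eye-hand ray direction
    normal = np.asarray(screen_normal, dtype=float)
    denom = direction @ normal
    if abs(denom) < 1e-9:                             # ray parallel to screen
        return None
    t = ((np.asarray(screen_point, dtype=float) - eye) @ normal) / denom
    if t <= 0:                                        # screen behind the user
        return None
    return eye + t * direction

# Hypothetical example: screen is the plane z = 0, user stands 2 m away,
# eye at 1.6 m height, hand slightly lower and in front of the body.
p = pointing_position(eye=[0.0, 1.6, 2.0], hand=[0.1, 1.2, 1.5],
                      screen_point=[0, 0, 0], screen_normal=[0, 0, 1])
# p → [0.4, 0.0, 0.0]: the pointing position on the screen plane.
```

In practice the 2D on-screen coordinate would then be obtained by projecting this 3D point into the screen's own coordinate frame.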
KSP Keywords
Calculate the center, Conventional methods, Depth image, Depth sensor, Head Detection, Large screen, One-to-one correspondence, Partitioning method, Position Estimation, Real-world, Top-view