ETRI Knowledge Sharing Platform

Gesture Analysis for Human-Robot Interaction
Cited 30 times in Scopus
Authors
Kye Kyung Kim, Keun Chang Kwak, Su Young Chi
Issue Date
2006-02
Citation
International Conference on Advanced Communication Technology (ICACT) 2006, pp.1824-1827
Language
English
Type
Conference Paper
Project Code
06MC1500, Software Development for Digital Image Production of Photo-Realistic Quality, Lee In Ho
Abstract
This paper presents gesture analysis for human-robot interaction. Gesture analysis consists of four processes: detecting hands in bimanual movements, spotting a meaningful gesture region in the image stream, extracting features, and recognizing the gesture. Skin color analysis, image motion detection, and shape information are used to detect bimanual hand movements and to spot gestures. The skin color information used for tracking hand gestures is obtained from the face detection region. The velocity of the moving hand is calculated to detect a meaningful gesture region across consecutive image frames. Combined gesture features, both structural and statistical, are extracted from the image stream. We conducted experiments to evaluate detection of bimanual hand movements and gesture recognition with a pan/tilt camera and with a single camera mounted on a mobile robot. The performance of gesture recognition was evaluated using the ETRI database, and an encouraging recognition rate of 89% was obtained.
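The velocity-based gesture spotting the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a hypothetical sequence of 2-D hand centroids (one per frame, e.g. from skin-color tracking) and marks the span of frames where hand speed exceeds a threshold as the candidate gesture region.

```python
# Minimal sketch (not the paper's code): spot a meaningful gesture
# region by thresholding hand-centroid velocity across frames.
# `centroids` is a hypothetical per-frame (x, y) hand position track.

def spot_gesture(centroids, velocity_threshold=5.0):
    """Return (start, end) frame-transition indices of the span where
    hand speed exceeds the threshold, or None if no such span exists."""
    # Speed between each pair of consecutive frames (Euclidean distance).
    speeds = []
    for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]):
        speeds.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    # Indices of frame transitions with fast motion.
    fast = [i for i, s in enumerate(speeds) if s > velocity_threshold]
    if not fast:
        return None
    return fast[0], fast[-1] + 1

# Hypothetical track: hand is still, moves quickly, then stops.
track = [(0, 0), (0, 1), (10, 1), (20, 5), (30, 10), (30, 11), (30, 12)]
print(spot_gesture(track))  # fast motion spans transitions 1..4 -> (1, 4)
```

In practice the paper combines this velocity cue with skin color and shape information for spotting; the threshold here is an arbitrary placeholder, not a value from the paper.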
KSP Keywords
Detection Region, Face detection, Gesture analysis, Gesture recognition, Gesture spotting, Hand gestures, Human-Robot Interaction(HRI), Image motion, Mobile robots, Motion detection, Performance evaluation