ETRI Knowledge Sharing Platform

Gesture Analysis for Human-Robot Interaction
Cited 30 times in Scopus
Authors
Kye Kyung Kim, Keun Chang Kwak, Su Young Chi
Issue Date
2006-02
Citation
International Conference on Advanced Communication Technology (ICACT) 2006, pp.1824-1827
Publisher
IEEE
Language
English
Type
Conference Paper
Abstract
This paper presents gesture analysis for human-robot interaction. Gesture analysis consists of four processes: detecting hands in bimanual movements, splitting a meaningful gesture region out of the image stream, extracting features, and recognizing the gesture. Skin color analysis, image motion detection, and shape information are used to detect bimanual hand movements and to spot gestures. The skin color information used to track hand gestures is obtained from the face detection region. The velocity of the moving hand is calculated to detect a meaningful gesture region within consecutive image frames. Combined gesture features, both structural and statistical, are extracted from the image stream. We evaluated detection of bimanual hand movements and gesture recognition with a pan/tilt camera and with a single camera mounted on a mobile robot. Performance of gesture recognition was evaluated using the ETRI database, and an encouraging recognition rate of 89% was obtained.
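The pipeline the abstract describes — skin-color hand detection, frame-to-frame motion detection, and velocity-based gesture spotting — can be sketched as below. This is a minimal illustration, not the paper's implementation: the YCrCb skin thresholds, the motion-difference threshold, and the speed threshold `v_min` are all assumed values not given in the abstract.

```python
import numpy as np

def skin_mask(ycrcb, cr_range=(133, 173), cb_range=(77, 127)):
    """Boolean mask of skin-colored pixels in a YCrCb image of shape (H, W, 3).
    The Cr/Cb ranges are commonly used defaults, not values from the paper."""
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))

def motion_mask(prev_gray, curr_gray, thresh=15):
    """Boolean mask of pixels whose intensity changed between two frames."""
    return np.abs(curr_gray.astype(int) - prev_gray.astype(int)) > thresh

def hand_centroid(mask):
    """Centroid (row, col) of a detected hand region, or None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

def spot_gesture(centroids, v_min=2.0):
    """Return (start, end) frame indices where hand speed exceeds v_min —
    the 'meaningful gesture region' chosen by the abstract's velocity test.
    `centroids` is a list of (row, col) hand positions, one per frame."""
    speeds = [np.hypot(b[0] - a[0], b[1] - a[1])
              for a, b in zip(centroids, centroids[1:])]
    active = [i for i, s in enumerate(speeds) if s > v_min]
    if not active:
        return None
    return active[0], active[-1] + 1
```

For example, feeding `spot_gesture` a centroid track that is still, then moves quickly, then stops again returns only the fast middle segment, which is the spotting step that separates a deliberate gesture from idle hand motion.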
KSP Keywords
Detection Region, Face detection, Gesture analysis, Gesture recognition, Gesture spotting, Hand Gesture, Hand movement, Human-Robot Interaction(HRI), Image motion, Mobile robots, Motion detection