ETRI Knowledge Sharing Platform


Detailed Information

Journal Article: Vision-Based Arm Gesture Recognition for a Long-Range Human-Robot Interaction
Cited 19 times in Scopus; downloaded 6 times
Authors
김도형, 이재연, 윤호섭, 김재홍, 손주찬
Publication Date
July 2013
Source
Journal of Supercomputing, v.65 no.1, pp.336-352
ISSN
0920-8542
Publisher
Springer
DOI
https://dx.doi.org/10.1007/s11227-010-0541-9
Funded Project
10MC4300, Development of u-Robot HRI Solution and Core Device Technology, 조재일
Abstract
This paper proposes a vision-based human arm gesture recognition method for human-robot interaction, particularly at a long distance where speech information is not available. We define four meaningful arm gestures for a long-range interaction. The proposed method is capable of recognizing the defined gestures only with 320×240 pixel-sized low-resolution input images captured from a single camera at a long distance, approximately five meters from the camera. In addition, the system differentiates the target gestures from the users' normal actions that occur in daily life without any constraints. For human detection at a long distance, the proposed approach combines results from mean-shift color tracking, short- and long-range face detection, and omega shape detection. The system then detects arm blocks using a background subtraction method with a background updating module and recognizes the target gestures based on information about the region, periodical motion, and shape of the arm blocks. From experiments using a large realistic database, a recognition rate of 97.235% is achieved, which is a sufficiently practical level for various pervasive and ubiquitous applications based on human gestures. © 2010 Springer Science+Business Media, LLC.
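The arm-block detection step above relies on background subtraction with a background-updating module. As a minimal sketch of that idea (not the paper's exact scheme, whose details are not given here), a running-average background model can be maintained and thresholded against each incoming frame; the frame size, learning rate `alpha`, and `threshold` below are illustrative assumptions:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background update: blend the current frame
    into the background model with learning rate alpha (assumed value)."""
    return (1.0 - alpha) * background + alpha * frame.astype(np.float64)

def foreground_mask(background, frame, threshold=25.0):
    """Mark pixels whose absolute difference from the background model
    exceeds the threshold as foreground (e.g. a moving arm block)."""
    diff = np.abs(frame.astype(np.float64) - background)
    return diff > threshold

# Toy 320x240 grayscale scene: a flat background with sensor noise,
# plus a bright rectangular region standing in for a raised arm.
rng = np.random.default_rng(0)
background = np.full((240, 320), 100.0)
frame = background + rng.normal(0.0, 2.0, size=background.shape)
frame[100:140, 200:260] = 200.0  # simulated arm region

mask = foreground_mask(background, frame)       # True inside the arm block
background = update_background(background, frame)
```

In a full pipeline, the binary mask would then be grouped into connected arm blocks whose region, periodic motion, and shape feed the gesture classifier, while the update step slowly absorbs lighting changes into the background model.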
KSP Suggested Keywords
Arm gesture recognition, Background subtraction(BS), Color Tracking, Face detection, Human detection, Human-Robot Interaction(HRI), Long-distance, Long-range interaction, Mean-shift(MS), Omega shape, Recognition method