ETRI Knowledge Sharing Platform


Learning Co-Speech Gestures for Humanoid Robots
Authors
Youngwoo Yoon, Minsu Jang, Jaeyeon Lee, Jaehong Kim
Issue Date
2018-06
Citation
International Conference on Ubiquitous Robots (UR) 2018, pp.1-2
Publisher
IEEE
Language
English
Type
Conference Paper
Abstract
Social intelligence is an essential capability for robots that live and interact with humans. Co-speech gesture, making body gestures while talking as humans do, is one of the required skills. Existing robots show co-speech gestures, but these are manually designed by human experts. We believe that robots could gain social intelligence from human demonstrations. In this work-in-progress paper, we present an RNN-based model that understands speech text and infers a proper hand gesture. A preliminary evaluation with a NAO robot was also conducted and showed promising results.
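The abstract describes an RNN-based model that reads speech text and infers a hand gesture. The paper does not detail its architecture here, so the following is only a minimal illustrative sketch: a toy RNN that consumes an utterance word by word and classifies it into one of a few made-up gesture labels. The embedding/hidden sizes, the gesture set, and the classification framing are all assumptions for illustration, not the authors' model.

```python
import math
import random

random.seed(0)

# Hypothetical gesture labels and sizes -- illustrative only,
# not taken from the paper.
GESTURES = ["beat", "point", "open_arms", "rest"]
EMB, HID = 8, 16

def vec(n):
    return [random.uniform(-0.1, 0.1) for _ in range(n)]

def mat(rows, cols):
    return [vec(cols) for _ in range(rows)]

def matvec(m, v):
    return [sum(w * x for w, x in zip(row, v)) for row in m]

class TinyGestureRNN:
    """Toy text-to-gesture classifier with an untrained vanilla RNN cell."""

    def __init__(self, vocab):
        self.emb = {w: vec(EMB) for w in vocab}   # word embeddings
        self.w_xh = mat(HID, EMB)                 # input -> hidden weights
        self.w_hh = mat(HID, HID)                 # hidden -> hidden weights
        self.w_hy = mat(len(GESTURES), HID)       # hidden -> gesture logits

    def infer(self, words):
        h = [0.0] * HID
        for w in words:                           # consume the utterance word by word
            x = self.emb.get(w, [0.0] * EMB)      # unknown words map to a zero vector
            h = [math.tanh(a + b)
                 for a, b in zip(matvec(self.w_xh, x), matvec(self.w_hh, h))]
        logits = matvec(self.w_hy, h)             # final hidden state -> gesture scores
        return GESTURES[logits.index(max(logits))]

rnn = TinyGestureRNN(vocab=["hello", "look", "over", "there"])
print(rnn.infer("look over there".split()))       # prints one of the gesture labels
```

A real system along these lines would train the weights on human demonstration data and drive the robot's joint trajectories from the predicted gesture, rather than returning a label.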
KSP Keywords
Hand Gesture, Nao robot, Preliminary evaluation, Social Intelligence, body gestures, humanoid robot