ETRI Knowledge Sharing Platform


Detailed Information

SGToolkit: An Interactive Gesture Authoring Toolkit for Embodied Conversational Agents (Conference Paper)
Cited 14 times in Scopus, downloaded 9 times
Authors
윤영우, 박근우, 장민수, 김재홍, 이기혁
Publication Date
October 2021
Source
ACM Symposium on User Interface Software and Technology (UIST) 2021, pp.826-840
DOI
https://dx.doi.org/10.1145/3472749.3474789
Funded Project
21HS1500, Development of Real-Environment Human-Care Robot Technology for the Aging Society, 이재연
Abstract
Non-verbal behavior is essential for embodied agents like social robots, virtual avatars, and digital humans. Existing behavior authoring approaches including keyframe animation and motion capture are too expensive to use when there are numerous utterances requiring gestures. Automatic generation methods show promising results, but their output quality is not satisfactory yet, and it is hard to modify outputs as a gesture designer wants. We introduce a new gesture generation toolkit, named SGToolkit, which gives a higher quality output than automatic methods and is more efficient than manual authoring. For the toolkit, we propose a neural generative model that synthesizes gestures from speech and accommodates fine-level pose controls and coarse-level style controls from users. The user study with 24 participants showed that the toolkit is favorable over manual authoring, and the generated gestures were also human-like and appropriate to input speech. The SGToolkit is platform agnostic, and the code is available at https://github.com/ai4r/SGToolkit.
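
The abstract describes a generative model that synthesizes gestures from speech while also accepting fine-level pose controls and coarse-level style controls from the user. The following is a minimal, hypothetical sketch of how such a control interface might look; the class name, feature dimensions, and the way controls are encoded are assumptions for illustration only, not the released SGToolkit code (see the GitHub repository above for the actual implementation).

    # Hypothetical sketch, assuming a PyTorch-style model; NOT the actual
    # SGToolkit interface. It illustrates the control scheme in the abstract:
    # gestures are generated from speech features, and the user can add
    # fine-level pose constraints (pin a pose at a frame) and a coarse-level
    # style vector. All names and dimensions are assumptions.
    import torch
    import torch.nn as nn

    class SpeechGestureGenerator(nn.Module):
        def __init__(self, audio_dim=128, pose_dim=27, style_dim=3, hidden=256):
            super().__init__()
            self.pose_dim, self.style_dim = pose_dim, style_dim
            # Speech features are concatenated with the (masked) pose controls
            # and the style vector at every frame before sequence encoding.
            self.encoder = nn.GRU(audio_dim + pose_dim + 1 + style_dim,
                                  hidden, batch_first=True, bidirectional=True)
            self.decoder = nn.Linear(2 * hidden, pose_dim)

        def forward(self, audio_feat, pose_control=None, control_mask=None, style=None):
            B, T, _ = audio_feat.shape
            if pose_control is None:  # no fine-level constraints supplied
                pose_control = audio_feat.new_zeros(B, T, self.pose_dim)
                control_mask = audio_feat.new_zeros(B, T, 1)
            if style is None:         # neutral / default style
                style = audio_feat.new_zeros(B, T, self.style_dim)
            x = torch.cat([audio_feat, pose_control, control_mask, style], dim=-1)
            h, _ = self.encoder(x)
            return self.decoder(h)    # pose sequence of shape (B, T, pose_dim)

    # Example: 60 frames of speech features; pin one pose at frame 30
    # (fine-level control) and request larger, faster motion (coarse-level style).
    model = SpeechGestureGenerator()
    audio = torch.randn(1, 60, 128)
    pose_ctrl = torch.zeros(1, 60, 27)
    mask = torch.zeros(1, 60, 1)
    pose_ctrl[0, 30] = torch.randn(27)
    mask[0, 30] = 1.0
    style = torch.tensor([1.5, 1.2, 0.0]).expand(1, 60, 3)
    gestures = model(audio, pose_ctrl, mask, style)   # (1, 60, 27)
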
KSP Suggested Keywords
Automatic generation, Behavior authoring, Embodied Conversational Agents(ECAs), Gesture generation, Human-like, Motion capture, Output quality, Quality output, User study, Virtual avatar, generative models