ETRI Knowledge Sharing Platform


Details

Journal Article
AIR-Act2Act: Human-Human Interaction Dataset for Teaching Non-verbal Social Behaviors to Robots
Cited 16 times in Scopus
Authors
Woo-Ri Ko, Minsu Jang, Jaeyeon Lee, Jaehong Kim
Date of Publication
April 2021
Source
The International Journal of Robotics Research, v.40 no.4-5, pp.691-697
ISSN
0278-3649
Publisher
SAGE
DOI
https://dx.doi.org/10.1177/0278364921990671
Project
19HS6200, Development of Human-Care Robot Technology for the Aging Society, Jaeyeon Lee
Abstract
To better interact with users, a social robot should understand the users' behavior, infer their intention, and respond appropriately. Machine learning is one way of implementing robot intelligence: it provides the ability to learn and improve from experience automatically, instead of explicitly telling the robot what to do. Social skills can also be learned by watching human-human interaction videos. However, human-human interaction datasets are relatively scarce for learning interactions that occur in various situations. Moreover, we aim to use service robots in the elderly care domain, yet no interaction dataset has been collected for this domain. For these reasons, we introduce a human-human interaction dataset for teaching non-verbal social behaviors to robots. It is the only interaction dataset in which elderly people have participated as performers. We recruited 100 elderly people and 2 college students to perform 10 interactions in an indoor environment. The entire dataset has 5,000 interaction samples, each of which contains depth maps, body indexes, and 3D skeletal data captured with three Microsoft Kinect v2 sensors. In addition, we provide the joint angles of a humanoid NAO robot, converted from the human behaviors that robots need to learn. The dataset and useful Python scripts are available for download at https://github.com/ai4r/AIR-Act2Act. It can be used not only to teach social skills to robots but also to benchmark action recognition algorithms.
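The abstract mentions converting captured 3D skeletal data into robot joint angles. A minimal sketch of that kind of conversion is computing the angle at a joint from three Kinect-style 3D positions; the joint names and coordinates below are illustrative assumptions, not the repository's actual file format or API.

```python
import numpy as np

def joint_angle(a, b, c):
    """Return the angle (radians) at joint b formed by 3D points a-b-c."""
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to guard against floating-point values slightly outside [-1, 1]
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Synthetic skeleton joints (meters): right shoulder, elbow, wrist
shoulder = [0.0, 1.4, 0.0]
elbow = [0.3, 1.4, 0.0]
wrist = [0.3, 1.1, 0.0]

elbow_angle = joint_angle(shoulder, elbow, wrist)
print(round(np.degrees(elbow_angle), 1))  # → 90.0 for this right-angle pose
```

An angle like this could then be mapped onto the corresponding motor of a humanoid such as NAO; the actual conversion scripts in the AIR-Act2Act repository handle the full set of joints.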
KSP Suggested Keywords
Action recognition, College students, Depth Map, Elderly Care, Elderly People, Human-Human Interaction, Indoor Environment, Joint angles, Microsoft Kinect V2, Nao robot, Non-verbal