ETRI Knowledge Sharing Platform


Details

Journal Article
Recognition of User Engagement and Intention Based on Interaction Video Annotation
Authors
장민수, 박천수, 이대하, 김재홍, 조영조
Publication Date
2014-06
Source
제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems), v.20 no.6, pp.612-618
ISSN
1976-5622
Publisher
제어로봇시스템학회 (Institute of Control, Robotics and Systems)
Project
14PC1800, Development of complex-knowledge-based judgment with over 90% judgment suitability and semantics-based robot expression technology for a human-friendly robot service environment, 김재홍
Abstract
A pattern classifier-based approach for recognizing the internal states of human participants in interactions is presented along with its experimental results. The approach includes a step for collecting video recordings of human-human or human-robot interactions and subsequently analyzing the videos based on human-coded annotations. The annotations include social signals directly observed in the video recordings and the internal states of human participants indirectly inferred from those observed social signals. A pattern classifier is then trained and tested on the annotation data. In our experiments on human-robot interaction, 7 video recordings were collected and annotated with 20 social signals and 7 internal states. Several experiments were performed, yielding a recall rate of 84.83% for interaction engagement, 93% for concentration intention, and 81% for task comprehension level using a C4.5-based decision tree classifier.
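The pipeline described in the abstract (social-signal annotations as features, inferred internal states as labels, a C4.5-style decision tree as the classifier) can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual setup: the signal names, the labeling rule, and the train/test split are all made up for the example, and scikit-learn's entropy-criterion tree is used as a stand-in for C4.5.

```python
# Sketch: annotation-based internal-state recognition with a decision tree.
# Synthetic data only; feature/label names are illustrative, not the
# paper's actual 20 social signals / 7 internal states.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

# Each row: binary indicators for a few hypothetical social signals,
# standing in for per-segment human-coded annotations of a video.
signals = ["gaze_at_robot", "nodding", "smiling", "utterance"]
X = rng.integers(0, 2, size=(200, len(signals)))

# Synthetic stand-in for a human-coded "engagement" label: a participant
# looking at the robot while nodding or speaking is labeled engaged.
y = ((X[:, 0] == 1) & ((X[:, 1] == 1) | (X[:, 3] == 1))).astype(int)

# criterion="entropy" uses information gain, as C4.5 does.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)
clf.fit(X[:150], y[:150])

pred = clf.predict(X[150:])
recall = recall_score(y[150:], pred)
print(f"engagement recall: {recall:.2f}")
```

In the paper's setting, the feature vectors would come from the 20 annotated social signals and the labels from the 7 annotated internal states, with recall measured per state as reported above.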
KSP Suggested Keywords
Annotation data, Based Approach, Decision Tree(DT), Decision Tree Classifier, Human-Robot Interaction(HRI), Recall rate, Social Signals, Video recording, human interaction, internal states, pattern classifier