ETRI Knowledge Sharing Platform

Title
상호작용 영상 주석 기반 사용자 참여도 및 의도 인식 (User Engagement and Intention Recognition Based on Interaction Video Annotation)
Authors
장민수, 박천수, 이대하, 김재홍, 조영조
Issue Date
2014-06
Citation
Journal of Institute of Control, Robotics and Systems (제어로봇시스템학회논문지), v.20, no.6, pp.612-618
ISSN
1976-5622
Publisher
Institute of Control, Robotics and Systems (ICROS)
Language
Korean
Type
Journal Article
Abstract
A pattern classifier-based approach for recognizing the internal states of human participants in interactions is presented, along with experimental results. The approach collects video recordings of human-human or human-robot interactions and analyzes them using human-coded annotations. The annotations cover both the social signals directly observed in the recordings and the internal states of the human participants indirectly inferred from those signals. A pattern classifier is then trained and tested on the annotation data. In our human-robot interaction experiments, 7 video recordings were collected and annotated with 20 social signals and 7 internal states. Using a C4.5-based decision tree classifier, the experiments achieved recall rates of 84.83% for interaction engagement, 93% for concentration intention, and 81% for task comprehension level.
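The pipeline the abstract describes — binary social-signal annotations as features, an inferred internal state as the label, and a C4.5 decision tree as the classifier — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy data, signal count, and label rule are hypothetical, and scikit-learn's CART tree with the entropy criterion is used as a close stand-in for C4.5.

```python
# Sketch of the annotation-to-classifier pipeline from the abstract.
# All data below is synthetic/hypothetical; the paper annotated 7 real
# interaction videos with 20 social signals and 7 internal states.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy annotation matrix: one row per video segment, one binary column per
# social signal (e.g. gaze-at-robot, nodding -- hypothetical signal names).
n_segments, n_signals = 200, 20
X = rng.integers(0, 2, size=(n_segments, n_signals))

# Hypothetical internal-state label ("engaged" vs. "not engaged"), derived
# from two signals plus noise to mimic coder-annotated ground truth.
y = ((X[:, 0] & X[:, 1]) | (rng.random(n_segments) < 0.1)).astype(int)

# Entropy-based decision tree, approximating the paper's C4.5 classifier.
clf = DecisionTreeClassifier(criterion="entropy", random_state=0)

# The paper reports recall, so evaluate recall under cross-validation.
scores = cross_val_score(clf, X, y, cv=5, scoring="recall")
print(f"mean recall over 5 folds: {scores.mean():.2f}")
```

The entropy criterion reproduces C4.5's information-gain splitting; the main remaining difference is C4.5's gain-ratio normalization and rule post-pruning, which CART does not perform.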
KSP Keywords
Annotation data, Based Approach, Decision Tree(DT), Decision Tree Classifier, Human-Robot Interaction(HRI), Recall rate, Social Signals, Video recording, human interaction, internal states, pattern classifier