ETRI Knowledge Sharing Platform



Details

Conference Paper: A Human Activity Recognition-Aware Framework Using Multi-modal Sensor Data Fusion
Cited 0 times in Scopus · Downloaded 4 times
Authors
권은정, 박현호, 변성원, 정의석, 이용태
Publication Date
January 2018
Source
International Conference on Consumer Electronics (ICCE) 2018, pp.1-2
DOI
https://dx.doi.org/10.1109/ICCE.2018.8326109
Project
18HR1800, Development of Multi-log-based Multimodal Data Fusion Analysis and Situation Response Platform Technology, 이용태
Abstract
In recent years, with smartphones and other pervasive devices, the paradigm of situation recognition has extended to IoT devices in the home. IoT devices produce data that can help predict accidents or disasters in private and public environments. In such IoT-equipped settings, machine learning technologies can help extract what is really meaningful according to the service provider's purpose. Therefore, to understand various user situations in an environment with IoT devices, a novel human activity recognition scheme is required that manages large volumes of data while guaranteeing accurate situation assessment in real time. Since the accuracy of the analysis results and the response time for informing users of a corresponding situation are important factors in providing services, we present a human activity recognition-aware framework using multi-modal sensors connected to IoT devices and consumer devices in the home.
KSP Suggested Keywords
Consumer devices, Human activity recognition(HAR), IoT Devices, Machine learning technologies, Multimodal sensor, Pervasive devices, Real-Time, Sensor data fusion, Service Provider, response time
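The paper itself does not publish its implementation, but the feature-level fusion of multi-modal sensor streams described in the abstract can be illustrated with a minimal, hypothetical sketch: per-modality statistical features are extracted from fixed-length sensor windows, concatenated into one fused vector, and fed to a simple activity classifier. The function names, the choice of mean/standard-deviation features, and the nearest-centroid classifier are all illustrative assumptions, not the authors' actual framework.

```python
# Hypothetical sketch of feature-level multi-modal sensor fusion for
# human activity recognition (HAR). Not the paper's implementation.
import math
from collections import defaultdict


def extract_features(window):
    """Summarize one sensor window (a list of samples) as [mean, std]."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return [mean, math.sqrt(var)]


def fuse(modalities):
    """Feature-level fusion: concatenate per-modality feature vectors.

    `modalities` is a list of sensor windows, one per device/modality
    (e.g. accelerometer, ambient temperature).
    """
    fused = []
    for window in modalities:
        fused.extend(extract_features(window))
    return fused


class NearestCentroid:
    """Toy activity classifier over fused feature vectors."""

    def fit(self, X, y):
        sums = {}
        counts = defaultdict(int)
        for vec, label in zip(X, y):
            if label not in sums:
                sums[label] = list(vec)
            else:
                sums[label] = [a + b for a, b in zip(sums[label], vec)]
            counts[label] += 1
        # One centroid per activity label.
        self.centroids = {lbl: [v / counts[lbl] for v in s]
                          for lbl, s in sums.items()}
        return self

    def predict(self, vec):
        # Return the label whose centroid is closest (squared Euclidean).
        return min(self.centroids,
                   key=lambda lbl: sum((a - b) ** 2
                                       for a, b in zip(self.centroids[lbl], vec)))
```

For example, a high-variance accelerometer window paired with an ambient reading could be fused and classified as "walking", while a flat window maps to "sitting"; a real framework would add many more modalities, richer features, and a learned model.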