ETRI Knowledge Sharing Platform


3D Human Pose Estimation Using Egocentric Depth Data
Cited 0 times in Scopus
Authors
Seongmin Baek, Youn-Hee Gil, Yejin Kim
Issue Date
2024-10
Citation
Symposium on Virtual Reality Software and Technology (VRST) 2024, pp.1-2
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1145/3641825.3689515
Abstract
In this paper, we present a novel approach for 3D human pose estimation using depth data from egocentric viewpoints. Depth data has the advantage of being less sensitive to color and lighting changes. We acquired depth data streamed from multiple depth cameras attached to a user's head and calibrated them into a single depth map. For joint detection, a ResNet-based network was optimized on the skeletal joints provided by a Kinect camera. Unlike previous approaches, the proposed approach can track 3D human poses in an egocentric setup with a small dataset.
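The abstract describes calibrating depth streams from multiple head-mounted cameras into one depth map. A minimal sketch of that fusion step is shown below, assuming pinhole intrinsics `(fx, fy, cx, cy)` and rigid extrinsics `(R, t)` per camera; the paper's actual calibration and network details are not specified here, so these names and the fusion-by-point-cloud approach are illustrative assumptions.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth map (H x W, in metres) to 3D points in the
    # camera's own coordinate frame, using pinhole intrinsics.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def fuse_depth_views(depths, intrinsics, extrinsics):
    # Transform each camera's points into a shared head-mounted reference
    # frame via its rigid extrinsic (R, t), then concatenate all views
    # into one calibrated point cloud (which can be re-rendered as a
    # single depth map for the joint-detection network).
    clouds = []
    for depth, (fx, fy, cx, cy), (R, t) in zip(depths, intrinsics, extrinsics):
        pts = depth_to_points(depth, fx, fy, cx, cy)
        clouds.append(pts @ R.T + t)
    return np.concatenate(clouds, axis=0)

# Toy example: two 2x2 depth maps from two co-registered cameras.
depth_a = np.full((2, 2), 2.0)
depth_b = np.full((2, 2), 3.0)
K = (1.0, 1.0, 0.5, 0.5)                      # fx, fy, cx, cy
eye = (np.eye(3), np.zeros(3))                # identity extrinsic
fused = fuse_depth_views([depth_a, depth_b], [K, K], [eye, eye])
```

The fused cloud has one row per valid depth pixel across all views; a ResNet-style regressor, as in the paper, would then consume the rendered depth map to predict joint positions supervised by Kinect skeletons.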
KSP Keywords
3D human pose estimation, Depth data, Depth map, Depth camera, Joint detection, Novel approach, Skeletal joints, Small dataset, Kinect camera