ETRI Knowledge Sharing Platform
Title
3D Mesh Transformation System using Multi-Object Tracking for Augmented Reality Services
Authors
Young-Suk Yoon, Sangwon Hwang, Heansung Lee, Sangyoun Lee
Issue Date
2021-10
Citation
International Conference on Information and Communication Technology Convergence (ICTC) 2021, pp.1558-1563
Publisher
IEEE
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1109/ICTC52510.2021.9620779
Abstract
In this paper, we propose a system that transforms the 3D space of the real world into a 3D mesh to provide augmented reality (AR) services. First, the proposed system acquires a 2D RGB image sequence with a monocular camera in order to model the real world as 3D data, and it creates a 3D point cloud containing the 3D spatial information of the real world from the obtained RGB image sequence. It also segments the objects that appear in the RGB image sequence into individual object instances. Next, the proposed system tracks the segmented objects and associates identification information with each of them, and it uses both this association and the 3D point cloud to perform 3D point labeling, in which object information is assigned to each 3D point. It then collects the 3D points that share the same label for each object. Finally, it converts the 3D points collected for each object into a 3D mesh. We confirmed that the proposed system can generate 3D meshes from 2D RGB image sequences captured with an ordinary monocular camera to provide interactive AR services.
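The abstract does not give implementation details, so the following is only a minimal sketch of the last two stages (3D point labeling and per-object meshing) under several assumptions not stated in the paper: points are labeled by projecting them into each frame with known pinhole intrinsics and camera poses from the reconstruction, the tracked instance masks come from an off-the-shelf segmentation and tracking pipeline, per-point IDs are resolved by a simple majority vote across frames, and meshing is done with Open3D's Poisson surface reconstruction. All function names here are hypothetical.

```python
import numpy as np
import open3d as o3d


def label_points(points, poses, K, instance_masks, num_ids):
    """Vote a tracked object ID for each 3D point by projecting it into every frame.

    points:         (N, 3) points from the monocular 3D reconstruction
    poses:          list of (4, 4) world-to-camera matrices, one per frame (assumed known)
    K:              (3, 3) pinhole intrinsics (assumed known)
    instance_masks: list of (H, W) int arrays; pixel value = tracked object ID, 0 = background
    num_ids:        number of tracked object IDs (IDs are 1..num_ids)
    """
    N = points.shape[0]
    votes = np.zeros((N, num_ids + 1), dtype=np.int32)
    pts_h = np.hstack([points, np.ones((N, 1))])          # homogeneous world coordinates

    for pose, mask in zip(poses, instance_masks):
        cam = (pose @ pts_h.T).T[:, :3]                   # world -> camera frame
        z = np.clip(cam[:, 2:3], 1e-6, None)              # avoid division by zero behind the camera
        uv = (K @ cam.T).T[:, :2] / z                     # pinhole projection
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = mask.shape
        valid = (cam[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        ids = np.zeros(N, dtype=np.int32)
        ids[valid] = mask[v[valid], u[valid]]             # look up the tracked ID under each projection
        votes[np.arange(N), ids] += 1

    labels = votes[:, 1:].argmax(axis=1) + 1              # most frequently observed object ID
    labels[votes[:, 1:].max(axis=1) == 0] = 0             # never projected onto an object -> background
    return labels


def mesh_per_object(points, labels):
    """Collect the points sharing each label and convert them into a per-object mesh."""
    meshes = {}
    for obj_id in np.unique(labels):
        if obj_id == 0:
            continue                                      # skip background points
        pcd = o3d.geometry.PointCloud()
        pcd.points = o3d.utility.Vector3dVector(points[labels == obj_id])
        pcd.estimate_normals()                            # Poisson reconstruction needs normals
        mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
        meshes[obj_id] = mesh
    return meshes
```

The majority vote is one simple way to keep per-point labels consistent when a tracked object's mask occasionally misses or mislabels a frame; the paper itself does not specify how the connected identification information is fused with the point cloud.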
KSP Keywords
3D Mesh, 3D data, 3D point cloud, 3D space, 3D spatial, Augmented reality(AR), Image sequence, Interactive AR, Monocular Camera, Multiple objects, Point labeling