ETRI Knowledge Sharing Platform


Compensating Spatiotemporally Inconsistent Observations for Online Dynamic 3D Gaussian Splatting
Authors
Youngsik Yun, Jeongmin Bae, Hyunseung Son, Seoha Kim, Hahyun Lee, Gun Bang, Youngjung Uh
Issue Date
2025-08
Citation
ACM SIGGRAPH 2025, pp.1-9
Language
English
Type
Conference Paper
DOI
https://dx.doi.org/10.1145/3721238.3730678
Abstract
Online reconstruction of dynamic scenes is significant because it enables learning scenes from live-streaming video inputs, whereas existing offline dynamic reconstruction methods rely on recorded videos. However, previous online reconstruction approaches have focused primarily on efficiency and rendering quality, overlooking the temporal consistency of their results, which often contain noticeable artifacts in static regions. This paper identifies that errors such as noise in real-world recordings cause temporal inconsistency in online reconstruction. We propose a method that enhances temporal consistency in online reconstruction from temporally inconsistent observations, which are inevitable with real cameras. We show that our method restores the ideal observation by subtracting the learned error. We demonstrate that applying our method to various baselines significantly enhances both temporal consistency and rendering quality across datasets. Code, video results, and checkpoints are available at https://bbangsik13.github.io/OR2.
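The abstract's core idea, restoring the ideal observation by subtracting a learned error, can be illustrated with a minimal sketch. This is not the paper's code: the array names, the additive error model, and the assumption that the error has already been estimated are all illustrative simplifications.

```python
import numpy as np

# Hedged sketch: model a real camera frame as an ideal frame plus a
# per-frame error (e.g. sensor noise). All names are hypothetical.
rng = np.random.default_rng(0)

ideal = rng.uniform(0.0, 1.0, size=(4, 4))    # stand-in for the ideal frame
error = 0.05 * rng.standard_normal((4, 4))    # error assumed already learned
observation = ideal + error                   # temporally inconsistent input

# Restoration step described in the abstract: subtract the learned error.
restored = observation - error

assert np.allclose(restored, ideal)
```

In the paper's setting the error is learned jointly during online reconstruction rather than given, but the restoration itself reduces to this subtraction.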
KSP Keywords
Dynamic Reconstruction, Real-world, Reconstruction method, Streaming video, dynamic scenes, live streaming, temporal consistency
This work is distributed under the terms of the Creative Commons License (CCL) CC BY.