
Teaser videos: Input Video · Novel View-Time Synthesis (Freeze Time & Change View) · Novel View-Time Synthesis (Freeze View & Change Time)

Abstract

We introduce a novel method for dynamic free-view synthesis of ambient scenes from a monocular capture, bringing an immersive quality to the viewing experience. Our method builds upon recent advancements in 3D Gaussian Splatting (3DGS) that can faithfully reconstruct complex static scenes. Previous attempts to extend 3DGS to represent dynamics have been confined to bounded scenes or require multi-camera captures, and often fail to generalize to unseen motions, limiting their practical applicability. Our approach overcomes these constraints by leveraging the periodicity of ambient motions to learn a motion trajectory model, coupled with careful regularization. We also propose important practical strategies to improve the visual quality of the baseline 3DGS static reconstructions and to improve the memory efficiency critical for GPU-memory-intensive learning. We demonstrate high-quality photorealistic novel view synthesis of several ambient natural scenes with intricate textures and fine structural elements.
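As a concrete illustration of the idea, below is a minimal PyTorch sketch of one way a periodic per-Gaussian trajectory model could be parameterized: each Gaussian's displacement over time is a learned combination of low-frequency sine/cosine bases, so the periodicity of ambient motion is baked into the parameterization, and a simple regularizer penalizes high-frequency coefficients. All names (PeriodicTrajectory, frequency_reg) and the exact Fourier basis are our assumptions for illustration, not the authors' released implementation.

import torch
import torch.nn as nn

class PeriodicTrajectory(nn.Module):
    """Hypothetical periodic motion model: displacement of each Gaussian
    mean over time, expressed in a fixed low-frequency Fourier basis."""

    def __init__(self, num_gaussians: int, num_freqs: int = 4, base_period: float = 1.0):
        super().__init__()
        # Fixed harmonic frequencies f_k = k / base_period, k = 1..num_freqs.
        freqs = torch.arange(1, num_freqs + 1, dtype=torch.float32) / base_period
        self.register_buffer("freqs", freqs)
        # Learnable sine/cosine coefficients per Gaussian and per (x, y, z) axis.
        self.sin_coeff = nn.Parameter(torch.zeros(num_gaussians, num_freqs, 3))
        self.cos_coeff = nn.Parameter(torch.zeros(num_gaussians, num_freqs, 3))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: scalar tensor (normalized time); returns (num_gaussians, 3)
        # displacements added to the static Gaussian means at render time.
        phase = 2 * torch.pi * self.freqs * t          # (num_freqs,)
        basis_sin = torch.sin(phase)                   # (num_freqs,)
        basis_cos = torch.cos(phase)
        delta = (self.sin_coeff * basis_sin[None, :, None]).sum(dim=1) \
              + (self.cos_coeff * basis_cos[None, :, None]).sum(dim=1)
        return delta

def frequency_reg(model: PeriodicTrajectory) -> torch.Tensor:
    # One plausible regularizer: weight coefficient energy by squared
    # frequency, discouraging jittery high-frequency motion.
    weights = model.freqs[None, :, None] ** 2
    return (weights * (model.sin_coeff ** 2 + model.cos_coeff ** 2)).mean()

Because the displacement is periodic in t by construction, the model extrapolates beyond the observed capture window rather than merely replaying it, which is consistent with the goal of generalizing to unseen (future) motion phases.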

Method Overview

(Figure: method overview.)

Comparisons

Novel View-Time Synthesis (Freeze Time & Change View)
RoDynRF [Liu et al. 2023] · 4D-GS [Wu et al. 2023] · Ours

Novel View-Time Synthesis (Freeze View & Change Time)
RoDynRF [Liu et al. 2023] · 4D-GS [Wu et al. 2023] · Ours

BibTeX

@inproceedings{ShihAmbGaus24,
  author = {Meng-Li Shih and Jia-Bin Huang and Changil Kim and Rajvi Shah and Johannes Kopf and Chen Gao},
  title = {Modeling Ambient Scene Dynamics for Free-view Synthesis},
  booktitle = {ACM SIGGRAPH},
  year = {2024}
}