TL;DR: FastGS is a new, simple, and general acceleration framework for 3D Gaussian Splatting (3DGS). It enables training a scene in about 100 seconds while maintaining comparable rendering quality.
The core idea is to combine multi-view consistent densification and targeted pruning, which together reduce unnecessary computation while preserving visual fidelity.
We demonstrate that FastGS accelerates training across different backbones and scene types, surpassing state-of-the-art Gaussian-based rendering methods in both reconstruction quality and training speed. Integrating FastGS into various baseline architectures significantly shortens their training time.
The FastGS pipeline rethinks importance estimation for Gaussians using multi-view consistency. We sample training views, build per-pixel L1 loss maps for each, and score every Gaussian by its accumulated contribution to rendering error across those views, so that densification and pruning target the Gaussians that actually matter.
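As a rough illustration of the idea above, the sketch below accumulates per-pixel L1 error onto the Gaussians responsible for each pixel across several sampled views, then keeps only the highest-scoring ones. The function names, the single-dominant-Gaussian-per-pixel simplification, and the fixed keep-ratio rule are illustrative assumptions, not FastGS's actual implementation.

```python
import numpy as np

def multiview_importance(loss_maps, contrib_ids, num_gaussians):
    """Score Gaussians by multi-view error contribution (sketch).

    loss_maps:   list of (H, W) per-pixel L1 error maps, one per sampled view.
    contrib_ids: list of (H, W) int arrays; contrib_ids[v][y, x] is the index
                 of the Gaussian assumed dominant at that pixel (a
                 simplification of the real per-pixel blending weights).
    """
    scores = np.zeros(num_aussians) if False else np.zeros(num_gaussians)
    for loss, ids in zip(loss_maps, contrib_ids):
        # Accumulate each pixel's error onto its dominant Gaussian.
        np.add.at(scores, ids.ravel(), loss.ravel())
    return scores

def prune_mask(scores, keep_ratio=0.7):
    """Keep the top `keep_ratio` fraction of Gaussians by score."""
    k = max(1, int(len(scores) * keep_ratio))
    thresh = np.partition(scores, -k)[-k]
    return scores >= thresh

# Toy example: 2 views, 2x2 images, 4 Gaussians.
loss_v1 = np.array([[1.0, 0.0], [0.0, 2.0]])
ids_v1  = np.array([[0, 1], [2, 3]])
loss_v2 = np.array([[0.5, 0.5], [0.0, 0.0]])
ids_v2  = np.array([[0, 0], [1, 2]])

scores = multiview_importance([loss_v1, loss_v2], [ids_v1, ids_v2], 4)
mask = prune_mask(scores, keep_ratio=0.5)
```

Gaussians that consistently sit behind high-error pixels in many views score high and survive pruning; those that rarely affect any view's error are dropped, which is where the compute savings come from.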
The figure shows densification progress (from 0.5K to 15K iterations) on the left and, on the right, pruning results obtained by applying Speedy-Splat's pruning strategy and VCP to vanilla 3DGS, illustrating how FastGS preserves quality while reducing compute.
Quantitative comparisons show that FastGS completes 3DGS training in about 100 seconds on average while keeping rendering quality comparable to prior fast-optimization methods.