MrNeRF (@janusch_patas)
2025-10-23 | ❤️ 187 | 🔁 20
MoE-GS: Mixture of Experts for Dynamic Gaussian Splatting
Contributions:
• MoE-GS: the first dynamic Gaussian splatting framework to employ a Mixture-of-Experts architecture, enabling robust, adaptive reconstruction across diverse dynamic scenes.
• A novel Volume-aware Pixel Router that integrates expert outputs through differentiable weight splatting, achieving spatially and temporally coherent adaptive blending (an illustrative sketch follows this list).
• Improved efficiency via single-pass multi-expert rendering and gate-aware Gaussian pruning. A separate knowledge-distillation strategy trains individual experts on pseudo-labels from the MoE model, enhancing quality without modifying the architecture.
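The post doesn't link code, so the PyTorch sketch below is purely illustrative of two of the ideas named above: per-pixel blending of expert renders via softmax gate weights, and gate-aware pruning of Gaussians no expert relies on. All names (`blend_expert_renders`, `gate_aware_prune`, `keep_ratio`), tensor layouts, and the softmax blend are assumptions, not the authors' API; the differentiable splatting of per-Gaussian gate values into screen space is elided.

```python
import torch
import torch.nn.functional as F

def blend_expert_renders(expert_images: torch.Tensor,
                         gate_logits: torch.Tensor) -> torch.Tensor:
    """Hypothetical per-pixel expert blending.

    expert_images: (E, 3, H, W) one rendered frame per expert
    gate_logits:   (E, H, W)    per-pixel gate scores, e.g. obtained by
                   splatting per-Gaussian gate values into screen space
                   (that step is not shown here)
    Returns a single blended (3, H, W) image.
    """
    # Normalize gate scores across experts so weights sum to 1 per pixel.
    weights = F.softmax(gate_logits, dim=0)                # (E, H, W)
    # Broadcast weights over the color channel and sum over experts.
    return (expert_images * weights.unsqueeze(1)).sum(dim=0)

def gate_aware_prune(gaussian_params: torch.Tensor,
                     gate_values: torch.Tensor,
                     keep_ratio: float = 0.7):
    """Hypothetical gate-aware pruning: drop Gaussians whose maximum
    gate weight across experts is low, i.e. no expert uses them much.

    gaussian_params: (N, D) packed per-Gaussian attributes
    gate_values:     (N, E) per-Gaussian, per-expert gate weights
    """
    importance = gate_values.max(dim=1).values             # (N,)
    k = int(keep_ratio * gaussian_params.shape[0])
    keep = importance.topk(k).indices
    return gaussian_params[keep], gate_values[keep]
```

Under this reading, single-pass multi-expert rendering would amount to producing all E expert images in one rasterization pass and blending them as above, rather than invoking the renderer once per expert.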