Our method reconstructs scenes with 2DGS in a feedforward manner, achieving high geometric accuracy and perceptual quality while also supporting mesh extraction.
Reconstructing 3D scenes from sparse images remains challenging due to the difficulty of recovering accurate geometry and texture without per-scene optimization. Recent approaches leverage generalizable models to generate 3D scenes from 3D Gaussian Splatting (3DGS) primitives. However, these methods often fail to produce continuous surfaces, instead yielding discrete, color-biased point clouds that appear plausible at normal resolution but reveal severe artifacts under close-up views.
To address this issue, we present SurfSplat, a feedforward framework built on 2D Gaussian Splatting (2DGS) primitives, which offer stronger anisotropy and higher geometric precision. By incorporating a surface continuity prior and a forced alpha blending strategy, SurfSplat reconstructs coherent geometry together with faithful textures.
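The abstract does not spell out the forced alpha blending strategy, so the sketch below shows one plausible reading, assuming it means renormalizing the front-to-back compositing weights so they sum to one, i.e. forcing each ray to terminate fully on the splats it hits rather than leaving residual transmittance. The function name `composite_forced_alpha` and this interpretation are assumptions for illustration, not the paper's verbatim formulation.

```python
import torch

def composite_forced_alpha(colors, alphas, force=True, eps=1e-8):
    """Front-to-back alpha compositing over splats sorted by depth.

    colors: (N, 3) per-splat RGB, sorted front-to-back along one ray.
    alphas: (N,)  per-splat opacity after the 2D Gaussian falloff.

    With force=True, blending weights are renormalized to sum to 1,
    so the ray is forced to land on the intersected surface splats
    (an assumed reading of "forced alpha blending").
    """
    # Standard volume-rendering transmittance: prod_{j<i} (1 - alpha_j).
    transmittance = torch.cumprod(
        torch.cat([alphas.new_ones(1), 1.0 - alphas[:-1]]), dim=0
    )
    weights = alphas * transmittance
    if force:
        weights = weights / (weights.sum() + eps)  # force full opacity
    return (weights[:, None] * colors).sum(dim=0)
```

Under this reading, the renormalization removes semi-transparent "fog" between splats, which is consistent with the goal of recovering continuous, opaque surfaces.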
Furthermore, we introduce High-Resolution Rendering Consistency (HRRC), a new metric that assesses reconstruction quality at high rendering resolutions. Extensive experiments on RealEstate10K, DL3DV, and ScanNet demonstrate that SurfSplat consistently outperforms prior methods on both standard metrics and HRRC, establishing a robust solution for high-fidelity 3D reconstruction from sparse inputs.
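The exact definition of HRRC lives in the paper; the sketch below only illustrates the general shape of such an evaluation, assuming it scores renders against high-resolution references with a standard image metric (PSNR here as a stand-in). The `render_fn` API and the choice of PSNR are hypothetical.

```python
import torch
import torch.nn.functional as F

def hrrc_score(render_fn, ref_images):
    """Hypothetical HRRC-style evaluation loop.

    render_fn(h, w) -> (3, h, w) render of the reconstructed scene at
                       the requested resolution (assumed interface).
    ref_images:       iterable of (3, H, W) high-resolution reference
                      images with values in [0, 1].
    """
    psnrs = []
    for ref in ref_images:
        _, h, w = ref.shape
        pred = render_fn(h, w)                  # render at the reference's resolution
        mse = F.mse_loss(pred.clamp(0, 1), ref)
        psnrs.append(-10.0 * torch.log10(mse))  # PSNR in dB
    return torch.stack(psnrs).mean()
```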
@misc{he2026surfsplatconqueringfeedforward2d,
      title={SurfSplat: Conquering Feedforward 2D Gaussian Splatting with Surface Continuity Priors},
      author={Bing He and Jingnan Gao and Yunuo Chen and Ning Cao and Gang Chen and Zhengxue Cheng and Li Song and Wenjun Zhang},
      year={2026},
      eprint={2602.02000},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2602.02000},
}