Neural-IMLS: Learning Implicit Moving Least-Squares for Surface Reconstruction from Unoriented Point Clouds
Surface reconstruction from noisy, non-uniform, and unoriented point clouds is a fascinating yet challenging problem in computer vision and graphics. With advances in 3D scanning technology, it is highly desirable to directly transform raw scan data, typically contaminated by severe noise, into a manifold triangle mesh. Existing learning-based approaches aim to learn an implicit function whose zero-level set encodes the underlying shape. However, most of them fail to produce satisfactory results on noisy and sparse point clouds, which limits their use in practice. In this paper, we introduce Neural-IMLS, a novel approach that learns a noise-resistant signed distance function (SDF) directly from unoriented raw point clouds. Instead of explicitly learning priors from ground-truth signed distance values, our method learns the underlying SDF from raw point clouds in a self-supervised fashion by minimizing the discrepancy between two SDFs: one obtained from the implicit moving least-squares (IMLS) function and the other from our neural network, where the gradients of the network define the tangent bundle that facilitates the IMLS computation. We prove that when the two SDFs coincide, our neural network predicts a signed implicit function whose zero-level set is a good approximation of the underlying surface. We conduct extensive experiments on various benchmarks, including synthetic and real-world scans, demonstrating the ability of Neural-IMLS to reconstruct faithful shapes from diverse inputs, especially point clouds with noise or gaps.
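The core idea, a self-supervised loss that compares the network's SDF prediction with an IMLS estimate built from the network's own gradients, can be illustrated with a minimal PyTorch sketch. This is not the authors' released code: the names (`sdf_net`, `imls_value`, `neural_imls_loss`), the Gaussian bandwidth `sigma`, the dense pairwise weighting, and the query sampling are all illustrative assumptions.

```python
# Minimal sketch of an IMLS-based self-supervision loss (assumptions noted above).
import torch
import torch.nn.functional as F


def imls_value(queries, points, normals, sigma=0.05):
    """IMLS estimate of the signed distance at each query point.

    queries: (Q, 3) query positions
    points:  (N, 3) raw (unoriented) point cloud
    normals: (N, 3) normals, here taken from the network gradient at `points`
    """
    # Pairwise offsets (Q, N, 3) and squared distances (Q, N).
    diff = queries[:, None, :] - points[None, :, :]
    dist2 = (diff ** 2).sum(-1)
    # Gaussian weights; the bandwidth sigma is an assumed hyperparameter.
    w = torch.exp(-dist2 / (sigma ** 2))
    # Signed point-to-plane distances n_j . (x - p_j).
    plane_dist = (diff * normals[None, :, :]).sum(-1)
    # Weighted average of point-to-plane distances.
    return (w * plane_dist).sum(-1) / (w.sum(-1) + 1e-8)


def neural_imls_loss(sdf_net, points, queries, sigma=0.05):
    """Discrepancy between the network SDF and the IMLS SDF at query points."""
    # Normals from the gradient of the network at the input points.
    pts = points.clone().requires_grad_(True)
    f_pts = sdf_net(pts)
    grads = torch.autograd.grad(f_pts.sum(), pts, create_graph=True)[0]
    normals = F.normalize(grads, dim=-1)

    # Compare the network's prediction with the IMLS estimate.
    f_q = sdf_net(queries).squeeze(-1)
    f_imls = imls_value(queries, points, normals, sigma)
    return ((f_q - f_imls) ** 2).mean()
```

In this sketch the supervision signal is generated entirely from the raw point cloud and the network itself (its gradients supply the normals needed by IMLS), so no ground-truth signed distances or oriented normals are required.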