
Convolutional Occupancy Networks

About

Recently, implicit neural representations have gained popularity for learning-based 3D reconstruction. While demonstrating promising results, most implicit approaches are limited to comparably simple geometry of single objects and do not scale to more complicated or large-scale scenes. The key limiting factor of implicit methods is their simple fully-connected network architecture which does not allow for integrating local information in the observations or incorporating inductive biases such as translational equivariance. In this paper, we propose Convolutional Occupancy Networks, a more flexible implicit representation for detailed reconstruction of objects and 3D scenes. By combining convolutional encoders with implicit occupancy decoders, our model incorporates inductive biases, enabling structured reasoning in 3D space. We investigate the effectiveness of the proposed representation by reconstructing complex geometry from noisy point clouds and low-resolution voxel representations. We empirically find that our method enables the fine-grained implicit 3D reconstruction of single objects, scales to large indoor scenes, and generalizes well from synthetic to real data.
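The core idea in the abstract — a convolutional encoder produces a feature volume, features at an arbitrary query point are gathered by interpolation, and a small fully-connected decoder maps point and feature to an occupancy probability — can be sketched in a few lines. The following NumPy toy is purely illustrative (all function names, shapes, and weights are made up for this sketch, not the authors' code): a single random 3×3×3 convolution stands in for the encoder, and a tiny two-layer MLP stands in for the occupancy decoder.

```python
import numpy as np

# Hypothetical sketch of the convolutional-occupancy-network idea:
# (1) a convolutional encoder turns an input voxel grid into a feature
#     volume; (2) features at any query point come from trilinear
#     interpolation of that volume; (3) a small fully-connected decoder
#     maps (point, feature) to an occupancy probability.
rng = np.random.default_rng(0)

def encode(voxels, channels=8):
    """Stand-in for a 3D convolutional encoder: one random 3x3x3
    convolution producing a (D, H, W, C) feature volume."""
    D, H, W = voxels.shape
    kernel = rng.standard_normal((3, 3, 3, channels)) * 0.1
    padded = np.pad(voxels, 1)
    feats = np.zeros((D, H, W, channels))
    for dz in range(3):
        for dy in range(3):
            for dx in range(3):
                feats += padded[dz:dz + D, dy:dy + H, dx:dx + W, None] * kernel[dz, dy, dx]
    return feats

def interpolate(feats, p):
    """Trilinear interpolation of the feature volume at a query point
    p in the unit cube [0, 1]^3."""
    D, H, W, C = feats.shape
    g = p * (np.array([D, H, W]) - 1)          # continuous grid coords
    lo = np.floor(g).astype(int)
    hi = np.minimum(lo + 1, [D - 1, H - 1, W - 1])
    t = g - lo
    out = np.zeros(C)
    for cz, wz in ((lo[0], 1 - t[0]), (hi[0], t[0])):
        for cy, wy in ((lo[1], 1 - t[1]), (hi[1], t[1])):
            for cx, wx in ((lo[2], 1 - t[2]), (hi[2], t[2])):
                out += wz * wy * wx * feats[cz, cy, cx]
    return out

def decode(p, f, w1, b1, w2, b2):
    """Tiny MLP occupancy decoder: ReLU hidden layer, sigmoid output."""
    h = np.maximum(np.concatenate([p, f]) @ w1 + b1, 0.0)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

# Toy forward pass on a random 8^3 occupancy grid.
voxels = (rng.random((8, 8, 8)) > 0.5).astype(float)
feats = encode(voxels)                 # (8, 8, 8, 8) feature volume
query = np.array([0.3, 0.7, 0.5])     # query point in the unit cube
f = interpolate(feats, query)
w1 = rng.standard_normal((3 + 8, 16)) * 0.1
w2 = rng.standard_normal(16) * 0.1
occ = decode(query, f, w1, np.zeros(16), w2, 0.0)
print(f"occupancy probability at query point: {occ:.3f}")
```

In the real model the encoder is a learned 2D/3D convolutional network and the decoder is trained with an occupancy loss; the key structural point the sketch shows is that the decoder conditions on *local* interpolated features rather than a single global latent code, which is what lets the method scale beyond single objects.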

Songyou Peng, Michael Niemeyer, Lars Mescheder, Marc Pollefeys, Andreas Geiger • 2020

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| 3D Reconstruction | ShapeNet (test) | – | 74 |
| 3D Geometry Reconstruction | ScanNet | – | 54 |
| 3D Surface Reconstruction | ShapeNet 12 (test) | CD1 0.44 | 24 |
| Object-level 3D Reconstruction | ShapeNet 13 classes (test) | Chamfer-L1 Distance 0.043 | 21 |
| Scene-level 3D Reconstruction | ScanNet (test) | F-Score 60 | 20 |
| Scene-level Reconstruction | Synthetic indoor scene dataset | IoU 81.6 | 14 |
| 3D Object Reconstruction | ShapeNet | IoU (airplane) 0.881 | 11 |
| Surface Reconstruction | ShapeNet (test) | CD-L1 0.044 | 11 |
| Shape Reconstruction | ShapeNet 55 categories v2 (test) | IoU 88.8 | 11 |
| Shape Reconstruction | ShapeNet 7 selected categories v2 (test) | IoU 0.881 | 11 |

Showing 10 of 46 rows.
