
DiverseDepth: Affine-invariant Depth Prediction Using Diverse Data

About

We present a method for depth estimation from monocular images that predicts high-quality depth on diverse scenes up to an affine transformation, thus preserving the accurate shape of a scene. Previous methods that predict metric depth often work well only for a specific scene type. In contrast, learning relative depth (whether one point is closer or farther than another) generalizes better, at the price of failing to recover the accurate geometric shape of the scene. In this work, we propose a dataset and methods to tackle this dilemma, aiming to predict accurate depth up to an affine transformation with good generalization to diverse scenes. First, we construct a large-scale and diverse dataset, termed the Diverse Scene Depth dataset (DiverseDepth), which covers a broad range of scenes and foreground contents. In contrast to previous learning objectives, i.e., learning metric depth or relative depth, we propose to learn affine-invariant depth using our diverse dataset to ensure both generalization and high-quality geometric shapes of scenes. Furthermore, to train the model effectively on this complex dataset, we propose a multi-curriculum learning method. Experiments show that our method outperforms previous methods on 8 datasets by a large margin under the zero-shot test setting, demonstrating the excellent generalization capacity of the learned model to diverse scenes. Point clouds reconstructed from the predicted depth show that our method recovers high-quality 3D shapes. Code and dataset are available at: https://tinyurl.com/DiverseDepth
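Evaluating depth that is only correct "up to an affine transformation" typically means fitting a per-image scale and shift to the prediction before scoring it. The sketch below illustrates that idea with a least-squares alignment in NumPy; the function name and toy data are ours for illustration, not taken from the paper's released code.

```python
import numpy as np

def align_scale_shift(pred, gt):
    """Least-squares fit of scale s and shift t so that s*pred + t ~ gt,
    then return the aligned prediction."""
    A = np.stack([pred, np.ones_like(pred)], axis=1)  # design matrix (N, 2)
    (s, t), *_ = np.linalg.lstsq(A, gt, rcond=None)
    return s * pred + t

# Toy example: the prediction differs from ground truth only by an affine map,
# so after alignment the error should vanish.
gt = np.array([1.0, 2.0, 3.0, 4.0])
pred = 0.5 * gt + 2.0
aligned = align_scale_shift(pred, gt)
abs_rel = np.mean(np.abs(aligned - gt) / gt)
```

Because `pred` here is an exact affine transform of `gt`, the recovered `aligned` depth matches the ground truth and the Abs Rel error is essentially zero; on real predictions the residual after alignment measures how well the scene's shape was preserved.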

Wei Yin, Xinlong Wang, Chunhua Shen, Yifan Liu, Zhi Tian, Songcen Xu, Changming Sun, Dou Renyin • 2020

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Monocular Depth Estimation | KITTI (Eigen) | Abs Rel | 19 | 502
Depth Estimation | NYU v2 (test) | Threshold Accuracy (delta < 1.25) | 70.4 | 423
Monocular Depth Estimation | NYU v2 (test) | Abs Rel | 11.7 | 257
Monocular Depth Estimation | KITTI | Abs Rel | 0.19 | 161
Monocular Depth Estimation | ETH3D | Abs Rel | 22.8 | 117
Monocular Depth Estimation | NYU V2 | Delta 1 Acc | 87.5 | 113
Depth Estimation | ScanNet | Abs Rel | 0.109 | 94
Monocular Depth Estimation | DIODE | Abs Rel | 37.6 | 93
Depth Estimation | KITTI | Abs Rel | 0.19 | 92
Depth Estimation | ScanNet (test) | Abs Rel | 0.109 | 65

(10 of 30 rows shown)
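The two metrics in the table, Abs Rel (mean absolute relative error) and the delta < 1.25 threshold accuracy, are standard in monocular depth benchmarks. A minimal sketch of how they are usually computed (function name and sample values are ours, for illustration only):

```python
import numpy as np

def depth_metrics(pred, gt):
    """Abs Rel: mean of |pred - gt| / gt over valid pixels.
    Delta 1: fraction of pixels where max(pred/gt, gt/pred) < 1.25."""
    abs_rel = np.mean(np.abs(pred - gt) / gt)
    ratio = np.maximum(pred / gt, gt / pred)
    delta1 = np.mean(ratio < 1.25)
    return abs_rel, delta1

# Small worked example with three depth values (metres).
gt = np.array([2.0, 4.0, 8.0])
pred = np.array([2.2, 3.0, 8.0])
abs_rel, delta1 = depth_metrics(pred, gt)
# abs_rel = mean(0.1, 0.25, 0.0), delta1 = 2/3 (the 4.0 vs 3.0 pixel fails)
```

Lower is better for Abs Rel, higher for delta accuracy; note the table mixes percentage-style values (e.g. 19, 70.4) and ratio-style values (e.g. 0.19), a unit inconsistency inherited from the different leaderboards.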
