Depth Any Panoramas: A Foundation Model for Panoramic Depth Estimation
About
In this work, we present a panoramic metric depth foundation model that generalizes across diverse scene distances. We explore a data-in-the-loop paradigm from the perspective of both data construction and framework design. We collect a large-scale dataset by combining public datasets, high-quality synthetic data from our UE5 simulator and text-to-image models, and real panoramic images from the web. To reduce the domain gaps between indoor/outdoor and synthetic/real data, we introduce a three-stage pseudo-label curation pipeline that generates reliable ground truth for unlabeled images. For the model, we adopt DINOv3-Large as the backbone for its strong pre-trained generalization, and introduce a plug-and-play range mask head, sharpness-centric optimization, and geometry-centric optimization to improve robustness to varying distances and enforce geometric consistency across views. Experiments on multiple benchmarks (e.g., Stanford2D3D, Matterport3D, and Deep360) demonstrate strong performance and zero-shot generalization, with particularly robust and stable metric predictions in diverse real-world scenes. The project page can be found at: [https://insta360-research-team.github.io/DAP_website/](https://insta360-research-team.github.io/DAP_website/)
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Monocular Depth Estimation | Stanford2D3D (test) | δ1 Accuracy | 95.64 | 71 |
| Depth Estimation | Matterport3D | δ1 Accuracy | 85.18 | 35 |
| Depth Estimation | Structure3D (test) | AbsRel | 0.0341 | 18 |
| 360 Depth Estimation | Stanford2D3D 1.0 (test) | AbsRel | 0.0921 | 14 |
| Depth Estimation | Structured3D (val) | δ1 Accuracy | 89.18 | 9 |
| Panoramic metric depth estimation | Matterport3D Indoor (test) | AbsRel | 0.1186 | 8 |
| Panoramic metric depth estimation | DAP 1.0 (test) | AbsRel | 0.0781 | 3 |
| Panoramic metric depth estimation | Deep360 Outdoor (test) | AbsRel | 0.0659 | 3 |
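
The table reports two standard depth-estimation metrics, AbsRel (mean absolute relative error) and δ1 accuracy (fraction of pixels whose prediction/ground-truth ratio is within 1.25). A minimal NumPy sketch of how these are conventionally computed (function names and the validity-mask convention are illustrative, not taken from this project's codebase):

```python
import numpy as np

def abs_rel(pred, gt, mask=None):
    """AbsRel: mean(|pred - gt| / gt) over pixels with valid ground truth."""
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    if mask is None:
        mask = gt > 0  # assume non-positive depth marks invalid pixels
    return float(np.mean(np.abs(pred[mask] - gt[mask]) / gt[mask]))

def delta1(pred, gt, mask=None, thresh=1.25):
    """delta1 accuracy: fraction of pixels with max(pred/gt, gt/pred) < 1.25."""
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    if mask is None:
        mask = gt > 0
    ratio = np.maximum(pred[mask] / gt[mask], gt[mask] / pred[mask])
    return float(np.mean(ratio < thresh))
```

Lower is better for AbsRel; higher is better for δ1. Note that on equirectangular panoramas these means are often latitude-weighted to compensate for pixel-area distortion near the poles, a detail omitted in this sketch.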