Shape from Blur: Recovering Textured 3D Shape and Motion of Fast Moving Objects
About
We address the novel task of jointly reconstructing the 3D shape, texture, and motion of an object from a single motion-blurred image. While previous approaches address the deblurring problem only in the 2D image domain, our rigorous modeling of all object properties in the 3D domain enables the correct description of arbitrary object motion. This leads to significantly better image decomposition and sharper deblurring results. We model the observed appearance of a motion-blurred object as a combination of the background and a 3D object undergoing constant translation and rotation. Our method minimizes a reconstruction loss on the input image via differentiable rendering with suitable regularizers, which enables estimating the textured 3D mesh of the blurred object with high fidelity. Our method substantially outperforms competing approaches on several benchmarks for fast-moving-object deblurring. Qualitative results show that the reconstructed 3D mesh generates high-quality temporal super-resolution and novel views of the deblurred object.
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Fast moving object deblurring | Falling Objects | PSNR | 27.18 | 7 |
| Fast moving object deblurring | TbD-3D Dataset | PSNR | 26.54 | 7 |
| Fast moving object deblurring | TbD Dataset | PSNR | 25.66 | 7 |
| 3D reconstruction of fast moving objects | Synthetic dataset, at most 90° rotation over 3 frames (large rotation) | Translational Error | 37.8 | 2 |
| 3D reconstruction of fast moving objects | Synthetic dataset, at most 30° rotation over 3 frames (small rotation) | Translational Error | 12.8 | 2 |