
Mip-NeRF: A Multiscale Representation for Anti-Aliasing Neural Radiance Fields

About

The rendering procedure used by neural radiance fields (NeRF) samples a scene with a single ray per pixel and may therefore produce renderings that are excessively blurred or aliased when training or testing images observe scene content at different resolutions. The straightforward solution of supersampling by rendering with multiple rays per pixel is impractical for NeRF, because rendering each ray requires querying a multilayer perceptron hundreds of times. Our solution, which we call "mip-NeRF" (a la "mipmap"), extends NeRF to represent the scene at a continuously-valued scale. By efficiently rendering anti-aliased conical frustums instead of rays, mip-NeRF reduces objectionable aliasing artifacts and significantly improves NeRF's ability to represent fine details, while also being 7% faster than NeRF and half the size. Compared to NeRF, mip-NeRF reduces average error rates by 17% on the dataset presented with NeRF and by 60% on a challenging multiscale variant of that dataset that we present. Mip-NeRF is also able to match the accuracy of a brute-force supersampled NeRF on our multiscale dataset while being 22x faster.
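The key idea behind representing scene content "at a continuously-valued scale" is mip-NeRF's integrated positional encoding (IPE): each conical frustum along a cast cone is approximated by a Gaussian, and the expected positional encoding of that Gaussian is fed to the MLP, so high-frequency features are automatically attenuated for wide (coarse) frustums. Below is a minimal NumPy sketch of this idea; the function name, argument names, and the diagonal-covariance simplification are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def integrated_pos_enc(mean, diag_cov, num_freqs=4):
    """Sketch of mip-NeRF's integrated positional encoding (IPE).

    Instead of encoding a single point x as [sin(2^l x), cos(2^l x)]
    (NeRF's PE), we encode a Gaussian (mean, diagonal covariance)
    approximating a conical frustum, using the closed-form expectation
    E[sin(x)] = sin(mu) * exp(-sigma^2 / 2) and likewise for cos.
    Large variances damp high-frequency terms, suppressing aliasing.
    """
    scales = 2.0 ** np.arange(num_freqs)                       # 2^0 .. 2^{L-1}
    scaled_mean = mean[..., None, :] * scales[:, None]         # (..., L, D)
    scaled_var = diag_cov[..., None, :] * scales[:, None] ** 2 # (..., L, D)
    damping = np.exp(-0.5 * scaled_var)                        # frequency attenuation
    enc = np.concatenate([np.sin(scaled_mean) * damping,
                          np.cos(scaled_mean) * damping], axis=-1)
    return enc.reshape(*mean.shape[:-1], -1)

# A near-zero variance reduces to NeRF's ordinary positional encoding;
# a broad Gaussian (coarse frustum) damps high frequencies toward zero.
mu = np.array([0.5, -0.2, 1.0])
enc_fine = integrated_pos_enc(mu, np.full(3, 1e-8))   # point-like sample
enc_coarse = integrated_pos_enc(mu, np.full(3, 10.0)) # wide frustum
```

Because one Gaussian summarizes an entire frustum, a single MLP query replaces the many point samples that brute-force supersampling would require, which is consistent with the speed and size advantages the abstract reports.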

Jonathan T. Barron, Ben Mildenhall, Matthew Tancik, Peter Hedman, Ricardo Martin-Brualla, Pratul P. Srinivasan • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Novel View Synthesis | Tanks&Temples (test) | PSNR 22.22 | 239 |
| Novel View Synthesis | Mip-NeRF 360 (test) | PSNR 27.69 | 166 |
| Novel View Synthesis | LLFF | PSNR 26.6 | 124 |
| Novel View Synthesis | MipNeRF 360 Outdoor | PSNR 24.47 | 112 |
| Novel View Synthesis | MipNeRF 360 Indoor | PSNR 31.72 | 108 |
| Novel View Synthesis | Mip-NeRF360 | PSNR 29.23 | 104 |
| Novel View Synthesis | DTU | PSNR 8.68 | 100 |
| Novel View Synthesis | LLFF 3-view | PSNR 16.11 | 95 |
| Novel View Synthesis | NeRF Synthetic | PSNR 32.63 | 92 |
| Novel View Synthesis | DTU (test) | PSNR 8.68 | 82 |

Showing 10 of 106 rows.
