
Meta-Continual Learning of Neural Fields

About

Neural Fields (NF) have gained prominence as a versatile framework for complex data representation. This work introduces a new problem setting, Meta-Continual Learning of Neural Fields (MCL-NF), and proposes a novel strategy that combines a modular architecture with optimization-based meta-learning. Designed to overcome the limitations of existing methods for continual learning of neural fields, such as catastrophic forgetting and slow convergence, our strategy achieves high-quality reconstruction with significantly faster learning. We further introduce a Fisher Information Maximization loss for neural radiance fields (FIM-NeRF), which maximizes information gains at the sample level to enhance learning generalization, with a proven convergence guarantee and generalization bound. We perform extensive evaluations across image, audio, and video reconstruction, as well as view-synthesis tasks, on six diverse datasets, demonstrating our method's superiority in reconstruction quality and speed over existing MCL and CL-NF approaches. Notably, our approach enables rapid adaptation of neural fields for city-scale NeRF rendering with reduced parameter requirements. Code is available at https://github.com/seungyoon-woo/mcl-nf.
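To make the core recipe concrete, below is a minimal sketch of optimization-based meta-learning applied to a coordinate neural field: a MAML-style inner loop adapts a shared meta-initialization to each new signal in a few gradient steps, and the outer loop differentiates through that adaptation. This is an illustrative assumption of the general technique the abstract names, not the authors' implementation; the names `FieldMLP`, `inner_adapt`, `inner_steps`, and `inner_lr` are all hypothetical, and the modular architecture and FIM-NeRF loss are omitted (see the repository for the actual method).

```python
# Hedged sketch: MAML-style meta-learning of a coordinate neural field.
# All names and hyperparameters here are illustrative, not from the paper.
import torch
import torch.nn as nn

class FieldMLP(nn.Module):
    """Tiny coordinate MLP mapping (x, y) -> RGB; stands in for one
    module of a modular neural-field architecture."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, coords):
        return self.net(coords)

def inner_adapt(model, coords, targets, inner_steps=3, inner_lr=1e-2):
    """Inner loop: adapt a functional copy of the meta-initialization to
    one signal, keeping the graph so the outer loop can backpropagate
    through the adaptation (standard second-order MAML)."""
    params = dict(model.named_parameters())
    for _ in range(inner_steps):
        pred = torch.func.functional_call(model, params, (coords,))
        loss = nn.functional.mse_loss(pred, targets)
        grads = torch.autograd.grad(loss, list(params.values()),
                                    create_graph=True)
        params = {k: p - inner_lr * g
                  for (k, p), g in zip(params.items(), grads)}
    return params

model = FieldMLP()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(100):  # meta-training over a stream of signals
    coords = torch.rand(256, 2)   # random pixel coordinates in [0, 1)^2
    targets = torch.rand(256, 3)  # stand-in RGB values for one signal
    adapted = inner_adapt(model, coords, targets)
    pred = torch.func.functional_call(model, adapted, (coords,))
    outer_loss = nn.functional.mse_loss(pred, targets)
    meta_opt.zero_grad()
    outer_loss.backward()  # gradients flow through the inner-loop steps
    meta_opt.step()
```

The payoff of this setup is speed at deployment time: after meta-training, a new signal (or a new chunk of a continual stream) only needs the few cheap inner-loop steps rather than training a field from scratch, which is what enables the rapid adaptation the abstract reports.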

Seungyoon Woo, Junhyeog Yun, Gunhee Kim • 2025

Related benchmarks

Task | Dataset | Result | Rank
Audio Reconstruction | LibriSpeech 1 | PESQ: 3.73 | 27
Video Reconstruction | VoxCeleb2 | SSIM: 0.996 | 27
Image Reconstruction | CelebA | SSIM: 0.995 | 27
Image Reconstruction | Imagenette | SSIM: 0.996 | 27
Neural Field Reconstruction | Imagenette | PSNR (Step 1): 25.18 | 9
Neural Field Reconstruction | FFHQ | PSNR (Step 1): 27.2 | 9
Neural Field Reconstruction | VoxCeleb2 | PSNR (Step 1): 29.84 | 9
Neural Field Reconstruction | CelebA | PSNR (Step 1): 29.29 | 9
Neural Field Reconstruction | LibriSpeech 1 | PSNR (Step 1): 42.09 | 9
Neural Field Reconstruction | LibriSpeech 3 | PSNR (Step 1): 38.84 | 9

(Showing 10 of 12 rows.)
