
Unlearning Noise in PINNs: A Selective Pruning Framework for PDE Inverse Problems

About

Physics-informed neural networks (PINNs) provide a promising framework for solving inverse problems governed by partial differential equations (PDEs) by integrating observational data and physical constraints into a unified optimization objective. However, the ill-posed nature of PDE inverse problems makes them highly sensitive to noise: even a small fraction of corrupted observations can distort internal neural representations, severely impairing accuracy and destabilizing training. Motivated by recent advances in machine unlearning and structured network pruning, we propose P-PINN, a selective pruning framework designed to unlearn the influence of corrupted data in a pretrained PINN. Specifically, starting from a PINN trained on the full dataset, P-PINN evaluates a joint residual–data fidelity indicator, a weighted combination of data misfit and PDE residuals, to partition the training set into reliable and corrupted subsets. Next, we introduce a bias-based neuron importance measure that quantifies directional activation discrepancies between the two subsets, identifying neurons whose representations are predominantly driven by corrupted samples. Building on this, an iterative pruning strategy removes noise-sensitive neurons layer by layer. The resulting pruned network is fine-tuned on the reliable data subject to the original PDE constraints, acting as a lightweight post-processing stage rather than a complete retraining. Numerical experiments on an extensive set of PDE inverse-problem benchmarks demonstrate that P-PINN substantially improves robustness, accuracy, and training stability under noisy conditions, achieving up to a 96.6% reduction in relative error compared with baseline PINNs. These results indicate that activation-level post hoc pruning is a promising mechanism for enhancing the reliability of physics-informed learning in noise-contaminated settings.
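To make the pipeline concrete, the sketch below walks through the four stages the abstract describes: scoring samples with the joint indicator, partitioning the data, measuring bias-based neuron importance, and pruning noise-sensitive neurons before fine-tuning. It is a minimal PyTorch-style reading of the abstract, not the authors' implementation; the function names, the weighting `alpha`, the quantile threshold, and the pruning fraction are all illustrative assumptions.

```python
# Hypothetical sketch of the P-PINN stages described in the abstract.
# `model`, `pde_residual`, `alpha`, `q`, and `frac` are assumptions,
# not quantities taken from the paper.
import torch


def joint_indicator(model, pde_residual, x, u_obs, alpha=0.5):
    """Joint residual-data fidelity score per sample: a weighted
    combination of data misfit and PDE residual magnitude."""
    u_pred = model(x)
    data_misfit = (u_pred - u_obs).pow(2).sum(dim=-1)
    physics = pde_residual(model, x).pow(2).sum(dim=-1)
    return alpha * data_misfit + (1.0 - alpha) * physics


def partition(model, pde_residual, x, u_obs, q=0.9):
    """Split observations into reliable / corrupted subsets by
    thresholding the indicator at an (assumed) upper quantile."""
    score = joint_indicator(model, pde_residual, x, u_obs).detach()
    bad = score > torch.quantile(score, q)
    return (x[~bad], u_obs[~bad]), (x[bad], u_obs[bad])


def bias_importance(act_reliable, act_corrupted):
    """Bias-based neuron importance: the gap between mean activations
    on the corrupted vs. reliable subsets. Large values flag neurons
    whose representations are predominantly driven by corrupted samples
    (one plausible reading of 'directional activation discrepancy')."""
    return (act_corrupted.mean(dim=0) - act_reliable.mean(dim=0)).abs()


def prune_layer(weight, importance, frac=0.1):
    """Zero out the output rows (neurons) of one linear layer with the
    highest noise-sensitivity scores. Applied layer by layer, then the
    pruned network is fine-tuned on the reliable subset under the
    original PDE loss."""
    k = max(1, int(frac * importance.numel()))
    idx = importance.topk(k).indices
    with torch.no_grad():
        weight[idx] = 0.0
    return idx
```

In this reading, the method stays a lightweight post-processing step: only the reliable subset is revisited during fine-tuning, and the PDE residual term in the loss is left unchanged.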

Yongsheng Chen, Yong Chen, Wei Guo, Xinghui Zhong • 2026

Related benchmarks

Task              | Dataset | Metric  | Result  | Rank
------------------|---------|---------|---------|-----
Data Assimilation | Poisson | fMSE-L1 | 11      | 4
Data Assimilation | Heat    | fMSE-L1 | 6.6     | 4
Data Assimilation | Wave    | fMSE-L1 | 5.4     | 4
Data Assimilation | Stokes  | fMSE-L1 | 12      | 4
Data Assimilation | Heat    | MSE     | 6.43e-4 | 4
Data Assimilation | Wave    | MSE     | 1.57e-4 | 4
Data Assimilation | Heat    | L1RE    | 1.35    | 4
Data Assimilation | Wave    | L1RE    | 0.025   | 4
Data Assimilation | Stokes  | L1RE    | 0.117   | 4
Data Assimilation | Poisson | L2RE    | 0.0804  | 4
Showing 10 of 52 rows
