
Inducing Point Operator Transformer: A Flexible and Scalable Architecture for Solving PDEs

About

Solving partial differential equations (PDEs) by learning their solution operators has emerged as an attractive alternative to traditional numerical methods. However, implementing such architectures presents two main challenges: flexibility in handling irregular and arbitrary input and output formats, and scalability to large discretizations. Most existing architectures are limited to a fixed input/output structure or are infeasible to scale to large inputs and outputs. To address these issues, we introduce an attention-based model called the inducing-point operator transformer (IPOT). Inspired by inducing point methods, IPOT is designed to handle any input function and output query while capturing global interactions in a computationally efficient way. By decoupling the input/output discretizations from the processor through a smaller latent bottleneck, IPOT offers flexibility in processing arbitrary discretizations and scales linearly with the size of the inputs/outputs. Our experimental results demonstrate that IPOT achieves strong performance with manageable computational complexity on an extensive range of PDE benchmarks and real-world weather forecasting scenarios, compared to state-of-the-art methods.
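The encode–process–decode pattern described above can be sketched in a few lines. This is a hypothetical, untrained numpy illustration (random weights, invented shapes and names; not the authors' implementation): a small set of m inducing points cross-attends to an arbitrary-size input discretization, self-attention runs only on that latent set, and arbitrary output queries cross-attend back to the latents, so the cost is linear in the number of input points and output queries.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention: (Nq,d),(Nk,d),(Nk,d) -> (Nq,d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

def ipot_forward(x_in, x_out, d=32, m=16):
    # x_in:  (N, d_in) input function samples at N arbitrary points
    # x_out: (Q, d)    embeddings of Q arbitrary output queries
    # d, m:  latent width and number of inducing points (m << N)
    n, d_in = x_in.shape
    W_in = rng.normal(size=(d_in, d)) / np.sqrt(d_in)  # random input embedding
    z_in = x_in @ W_in                                 # (N, d)
    latents = rng.normal(size=(m, d))                  # learned inducing points
    z = attention(latents, z_in, z_in)  # encode:  O(m*N), linear in N
    z = attention(z, z, z)              # process: O(m^2), independent of N
    return attention(x_out, z, z)       # decode:  O(Q*m), linear in Q

out = ipot_forward(rng.normal(size=(1000, 3)), rng.normal(size=(50, 32)))
print(out.shape)  # (50, 32)
```

Because the quadratic self-attention touches only the m latents, changing the input mesh or the set of output queries requires no architectural change, which is the flexibility and linear scaling the abstract claims.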

Seungjun Lee, Taeil Oh • 2023

Related benchmarks

Task | Dataset | Metric | Result | Rank
Forward PDE solving | Plasticity | Relative L2 Error | 0.0033 | 21
Forward PDE solving | Airfoil | Relative L2 | 0.88 | 21
Forward PDE solving | Elasticity | Relative L2 Error | 0.0156 | 19
PDE solving | Navier-Stokes Point-wise (25% test ratio) | Relative L2 Error | 0.2526 | 15
PDE solving | ERA5 Patch-wise (50% test ratio) | Relative L2 Error | 0.0389 | 12
PDE solving | Navier-Stokes Point-wise (5% test ratio) | Relative L2 Error | 0.2528 | 8
PDE solving | Diffusion-Reaction Point-wise (test) | Relative L2 Error | 0.023 | 7
