Neural delay differential equations: learning non-Markovian closures for partially known dynamical systems
About
Recent advances in learning dynamical systems from data have shown significant promise. However, many existing methods assume access to the full state of the system -- an assumption that is rarely satisfied in practice, where systems are typically monitored through a limited number of sensors, leading to partial observability. To address this challenge, we draw inspiration from the Mori-Zwanzig formalism, which provides a theoretical connection between hidden variables and memory terms. Motivated by this perspective, we introduce a constant-lag Neural Delay Differential Equation (NDDE) framework, providing a continuous-time approach for learning non-Markovian dynamics directly from data. These memory effects are captured using a finite set of time delays, which are identified via the adjoint method. We validate the proposed approach on a range of datasets spanning synthetic systems, chaotic dynamics, and experimental measurements, including the Kuramoto-Sivashinsky equation and cavity-flow experiments. Results demonstrate that NDDEs compare favourably with existing approaches for partially observed systems, including long short-term memory (LSTM) networks and augmented neural ordinary differential equations (ANODEs). Overall, NDDEs offer a principled and data-efficient framework for modelling non-Markovian dynamics under partial observability. An open-source implementation accompanies this article.
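To make the constant-lag idea concrete, the sketch below integrates a delay differential equation dx/dt = f(x(t), x(t - tau)) with a fixed-step Euler scheme and a history buffer. This is a minimal illustration, not the paper's implementation: the linear map standing in for the learned closure, its weights, and the function names are all hypothetical, and in the actual method f would be a neural network with the delay tau trained via the adjoint method.

```python
import numpy as np

def integrate_ndde(f, history, tau, t_end, dt):
    """Fixed-step Euler integration of dx/dt = f(x(t), x(t - tau)).

    history: callable giving x(t) for t <= 0 (the initial history function).
    For simplicity, tau is assumed to be an integer multiple of dt.
    """
    lag = int(round(tau / dt))
    n_steps = int(round(t_end / dt))
    # Pre-fill the buffer with the history function on [-tau, 0].
    xs = [history(-tau + i * dt) for i in range(lag + 1)]
    for _ in range(n_steps):
        x_now = xs[-1]
        x_delayed = xs[-1 - lag]  # x(t - tau), read back from the stored trajectory
        xs.append(x_now + dt * f(x_now, x_delayed))
    return np.array(xs[lag:])  # trajectory on [0, t_end]

# Toy "closure": a tiny linear map standing in for a neural network.
# (Hypothetical parameters, for illustration only.)
W_now, W_del = -0.5, -0.3
f = lambda x, x_del: W_now * x + W_del * x_del

traj = integrate_ndde(f, history=lambda t: 1.0, tau=1.0, t_end=5.0, dt=0.01)
print(traj.shape)
```

The delayed state is recovered by indexing `lag` steps back into the stored trajectory, which is why a history function on `[-tau, 0]` is needed to start the integration; a trainable version would instead parametrise `f` and differentiate through the solve.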
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Closure modeling | Kuramoto-Sivashinsky (KS) system (test) | Test MSE loss | 0.067 | 9 |
| Time-series prediction | KS (test) | MSE | 0.28 | 5 |
| Time-series prediction | Cavity (test) | MSE | 0.13 | 5 |
| Maximum Lyapunov exponent estimation | KS system (test) | Max Lyapunov exponent (lambda_max) | 0.128 | 5 |
| Time-series prediction | Brusselator (test) | MSE | 0.016 | 5 |