Parameter identification for PDEs using sparse interior data and a recurrent neural network

Research output: Contribution to journal › Article › peer-review

Abstract

Physics-informed neural networks have proven to be a powerful approach for addressing both forward and inverse problems by integrating the governing equations’ residuals and data constraints within the loss function. However, their performance declines significantly when interior data is sparse. In this study, we propose a new approach to address this issue by combining Gated Recurrent Units with an implicit numerical method. First, the input is fed into the neural network to produce an initial solution approximation over the entire domain. Next, an implicit numerical method is employed to simulate the time-iteration scheme based on these approximate solutions, wherein the unknown parameters of the partial differential equations are initially assigned random values. In this approach, the physical constraints are integrated into the time-iteration scheme, allowing us to formulate mean square errors between the iteration scheme and the neural network’s approximate solutions. Furthermore, mean square errors comparing sparse interior data points and the network’s corresponding predictions are incorporated into the loss function. By minimizing this combined loss, the unknown parameters are identified and the complete solution is obtained. The algorithm’s effectiveness is demonstrated in various numerical experiments, including Burgers’ equation, the Allen–Cahn equation, and the nonlinear Schrödinger equation.

Original language: English
Article number: 33828
Journal: Scientific Reports
Volume: 15
Issue number: 1
DOIs
State: Published - Dec 2025

Bibliographical note

Publisher Copyright:
© The Author(s) 2025.

Keywords

  • Deep learning
  • Hybrid model
  • Inverse problem
  • Sparse data
  • Time-dependent PDEs

ASJC Scopus subject areas

  • General
