Physics-Informed Neural Networks: A Review of Methodological Evolution, Theoretical Foundations, and Interdisciplinary Frontiers Toward Next-Generation Scientific Computing


Bibliographic Details
Main Authors: Zhiyuan Ren, Shijie Zhou, Dong Liu, Qihe Liu
Format: Article
Language: English
Published: MDPI AG 2025-07-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/14/8092
Description
Summary: Physics-informed neural networks (PINNs) have emerged as a transformative methodology integrating deep learning with scientific computing. This review establishes a three-dimensional analytical framework to systematically decode PINNs' development through methodological innovation, theoretical breakthroughs, and cross-disciplinary convergence. The contributions are threefold. First, it identifies the co-evolutionary path of algorithmic architectures, from adaptive optimization (neural tangent kernel-guided weighting achieving 230% convergence acceleration in Navier-Stokes solutions) to hybrid numerical-deep learning integration (5× speedup via domain decomposition). Second, it constructs bidirectional theory-application mappings in which convergence analysis (operator approximation theory) and generalization guarantees (Bayesian-physical hybrid frameworks) directly inform engineering implementations, as validated by a 72% cost reduction compared to FEM in high-dimensional spaces (p < 0.01, n = 15 benchmarks).
Third, it pioneers cross-domain knowledge transfer through application-specific architectures: TFE-PINN for turbulent flows (5.12 ± 0.87% error in NASA hypersonic tests), ReconPINN for medical imaging (ΔSSIM = +0.18 ± 0.04 on multi-institutional MRI), and SeisPINN for seismic systems (0.52 ± 0.18 km localization accuracy). We further present a technological roadmap highlighting three critical directions for PINN 2.0: neuro-symbolic methods, federated physics learning, and quantum-accelerated optimization. This work provides methodological guidelines and theoretical foundations for next-generation scientific machine learning systems.
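The adaptive loss weighting the summary mentions can be illustrated with a minimal sketch. The snippet below shows one common gradient-statistics balancing scheme for the composite PINN objective (re-scaling the boundary-loss weight so its gradient magnitude keeps pace with the PDE-residual gradient); it is related in spirit to the NTK-guided weighting the review discusses, but it is not the review's exact algorithm, and the names `adaptive_weight`, `total_loss`, and the smoothing parameter `alpha` are illustrative choices, not from the source.

```python
import numpy as np

def adaptive_weight(grad_residual, grad_boundary, lam_prev, alpha=0.9):
    """One step of gradient-statistics loss balancing for a PINN.

    The boundary-loss weight lam is nudged toward the ratio of the
    residual gradient's max magnitude to the boundary gradient's mean
    magnitude, so neither loss term dominates training. alpha is an
    exponential-moving-average factor (an assumed hyperparameter).
    """
    lam_hat = np.max(np.abs(grad_residual)) / (np.mean(np.abs(grad_boundary)) + 1e-12)
    return alpha * lam_prev + (1.0 - alpha) * lam_hat

def total_loss(residual_loss, boundary_loss, lam):
    # Composite PINN objective: PDE residual plus weighted boundary/data term.
    return residual_loss + lam * boundary_loss

# Example: residual gradients are much larger than boundary gradients,
# so the weight grows to re-balance the two loss terms.
g_r = np.array([1.0, -2.0, 4.0])   # gradients of the PDE-residual loss
g_b = np.array([0.5, 1.5])         # gradients of the boundary loss
lam = adaptive_weight(g_r, g_b, lam_prev=1.0)   # ~ 1.3
```

In a full training loop, `lam` would be recomputed every few optimizer steps from the current per-term gradients; the moving average keeps the weight from oscillating between updates.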
ISSN: 2076-3417