A deep neural network with attention mechanism for flow prediction of compressor blade

Bibliographic Details
Main Authors: Guanyu Gao, Gang Wang
Format: Article
Language: English
Published: Nature Portfolio 2025-05-01
Series: Scientific Reports
Subjects:
Online Access: https://doi.org/10.1038/s41598-025-99688-0
Description
Summary: For flow-related design optimization problems, computational fluid dynamics (CFD) simulations are commonly used to predict the flow fields. However, the computational expense of CFD simulations limits the opportunities for design exploration. Motivated by this issue, a convolutional neural network (CNN) based on the U-Net architecture with an attention mechanism (AM) is proposed to efficiently learn flow representations from CFD results and shorten the compressor blade design cycle. The proposed model converts the given shape information and flow conditions into grayscale images and directly predicts the expected flow field, saving computational time. An extensive hyper-parameter search is performed to determine the optimal model. Qualitative and quantitative analyses of the results are conducted to evaluate the accuracy of the predicted Mach number distributions. In particular, two new attention mechanisms are developed to preserve the physical consistency of complex flow fields with shock waves. Mach number flow fields under different working conditions are predicted with the proposed model, and the predictions agree well with CFD results. A speedup of more than three orders of magnitude over traditional CFD methods is achieved at all batch sizes, while maintaining low prediction errors.
ISSN: 2045-2322
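To illustrate the kind of model the abstract describes, the sketch below shows a small attention-gated U-Net in PyTorch that maps stacked grayscale inputs (a blade-shape image plus a flow-condition map) to a per-pixel Mach number field. This is only a minimal illustration of the general Attention U-Net pattern; the channel counts, the two-level depth, the additive attention gate, and the AttentionUNet/AttentionGate names are assumptions, not the authors' published architecture or the two new attention mechanisms introduced in the paper.

```python
# Minimal sketch (not the authors' code): attention-gated U-Net mapping
# grayscale shape/condition images to a predicted Mach number field.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with BatchNorm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class AttentionGate(nn.Module):
    """Gates encoder skip features with a decoder signal before concatenation."""
    def __init__(self, enc_ch, dec_ch, mid_ch):
        super().__init__()
        self.w_enc = nn.Conv2d(enc_ch, mid_ch, 1)
        self.w_dec = nn.Conv2d(dec_ch, mid_ch, 1)
        self.psi = nn.Sequential(nn.Conv2d(mid_ch, 1, 1), nn.Sigmoid())

    def forward(self, enc, dec):
        attn = self.psi(torch.relu(self.w_enc(enc) + self.w_dec(dec)))
        return enc * attn  # suppress irrelevant encoder activations


class AttentionUNet(nn.Module):
    def __init__(self, in_ch=2, out_ch=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.att2 = AttentionGate(base * 2, base * 2, base)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.att1 = AttentionGate(base, base, base // 2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, out_ch, 1)  # per-pixel Mach number

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.up2(b)
        d2 = self.dec2(torch.cat([self.att2(e2, d2), d2], dim=1))
        d1 = self.up1(d2)
        d1 = self.dec1(torch.cat([self.att1(e1, d1), d1], dim=1))
        return self.head(d1)


if __name__ == "__main__":
    # Channel 0: grayscale blade-shape mask; channel 1: flow-condition map.
    x = torch.rand(4, 2, 128, 128)
    print(AttentionUNet()(x).shape)  # torch.Size([4, 1, 128, 128])
```

Once trained on CFD snapshots, a single forward pass of such a network replaces an iterative flow solve, which is the source of the large batch-inference speedup reported in the abstract.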