Application of deep residual networks to predict the effective properties of fiber-reinforced composites with voids

Bibliographic Details
Main Authors: Mahdi Karimian, Seyed Ali Hosseini Kordkheili
Format: Article
Language: English
Published: SAGE Publishing 2025-01-01
Series: Advances in Mechanical Engineering
Online Access: https://doi.org/10.1177/16878132251315871
Description
Summary:A novel deep-learning method is adopted to predict the effective mechanical properties of epoxy-based fiber-reinforced composites. To generate the mechanical properties and image data for learning, appropriate representative volume elements (RVEs) with periodic boundary conditions are used in finite element models (FEMs). Using a random algorithm, voids and fibers of different diameters are generated within the RVEs; the resulting composites have void and fiber volume fractions in the ranges 0.00–0.03 and 0.40–0.65, respectively. Four different CNNs, ranging from a simple to a deeper architecture, are trained with an MSE loss function to increase accuracy, and SGD with momentum and weight decay is used to minimize the loss. Each of the four models is trained on each considered material, both separately and simultaneously, and their performance and accuracy on the training, validation, and test data are compared. According to the results, the ResNet model performs best: all properties are predicted with an accuracy greater than 98.79%, and the error margin is less than 4.3% for all properties of the carbon-fiber composites and less than 7.3% for the glass-fiber composites. The proposed model thus predicts mechanical properties well at a lower computational cost.
ISSN:1687-8140
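
The summary above describes the training setup (a ResNet-style CNN, MSE loss, SGD with momentum and weight decay) only at a high level. Below is a minimal sketch of such a setup, assuming a PyTorch/torchvision implementation; the framework choice, image size, learning rate, decay value, and the number of predicted properties (NUM_PROPERTIES) as well as the helper train_step are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the training setup described in the summary,
# assuming PyTorch/torchvision (the paper does not name a framework).
import torch
import torch.nn as nn
from torchvision.models import resnet18

NUM_PROPERTIES = 5  # assumed number of effective properties; not from the paper

# Adapt a ResNet to regression on single-channel RVE images.
model = resnet18(weights=None)
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
model.fc = nn.Linear(model.fc.in_features, NUM_PROPERTIES)

# MSE loss and SGD with momentum and weight decay, as stated in the summary;
# the learning rate and weight-decay values are assumptions.
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2,
                            momentum=0.9, weight_decay=1e-4)

def train_step(images, targets):
    """One optimization step: images are RVE micrographs, targets are
    the FEM-computed effective properties used as regression labels."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Smoke test with random tensors standing in for the FEM dataset.
    x = torch.randn(8, 1, 224, 224)       # batch of RVE images
    y = torch.randn(8, NUM_PROPERTIES)    # corresponding property labels
    print(train_step(x, y))
```

In an actual reproduction, the random tensors in the smoke test would be replaced by the FEM-generated RVE images and their computed effective properties.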