Beyond language: Applying MLX transformers to engineering physics

Transformer Neural Networks are driving an explosion of activity and discovery in the field of Large Language Models (LLMs). In contrast, there have been only a few attempts to apply Transformers in engineering physics. Aiming to offer an easy entry point to physics-centric Transformers, we introduce a physics-informed Transformer model for solving the heat conduction problem in a 2D plate with Dirichlet boundary conditions. The model is implemented in the machine learning framework MLX and leverages the unified memory of Apple M-series processors. The use of MLX means that the models can be trained and perform predictions efficiently on personal machines with only modest memory requirements. To train, validate and test the Transformer model, we solve the 2D heat conduction problem using central finite differences. Each finite difference solution in these sets is initialized with four random Dirichlet boundary conditions, a uniform but random internal temperature distribution and a randomly selected thermal diffusivity. Validation is performed in-line during training to guard against over-fitting. The excellent performance of the trained model is demonstrated by predicting the evolution of the temperature field to steady state for the unseen test set of conditions.

Bibliographic Details
Main Authors: Stavros Kassinos, Alessio Alexiadis
Format: Article
Language: English
Published: Elsevier 2025-06-01
Series: Results in Engineering
Subjects: Physics-informed transformers; MLX framework; Heat conduction
Online Access: http://www.sciencedirect.com/science/article/pii/S2590123025009478
_version_ 1850183112076558336
author Stavros Kassinos
Alessio Alexiadis
author_facet Stavros Kassinos
Alessio Alexiadis
author_sort Stavros Kassinos
collection DOAJ
description Transformer Neural Networks are driving an explosion of activity and discovery in the field of Large Language Models (LLMs). In contrast, there have been only a few attempts to apply Transformers in engineering physics. Aiming to offer an easy entry point to physics-centric Transformers, we introduce a physics-informed Transformer model for solving the heat conduction problem in a 2D plate with Dirichlet boundary conditions. The model is implemented in the machine learning framework MLX and leverages the unified memory of Apple M-series processors. The use of MLX means that the models can be trained and perform predictions efficiently on personal machines with only modest memory requirements. To train, validate and test the Transformer model, we solve the 2D heat conduction problem using central finite differences. Each finite difference solution in these sets is initialized with four random Dirichlet boundary conditions, a uniform but random internal temperature distribution and a randomly selected thermal diffusivity. Validation is performed in-line during training to guard against over-fitting. The excellent performance of the trained model is demonstrated by predicting the evolution of the temperature field to steady state for the unseen test set of conditions.
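
The description above mentions generating training, validation and test data by solving the 2D heat conduction problem with central finite differences, using four random Dirichlet boundary values, a uniform but random initial interior temperature and a randomly selected thermal diffusivity. The following is a minimal illustrative sketch of such a data generator written against the MLX Python API (mlx.core); the grid size, number of time steps, stability factor, diffusivity range and all function names are assumptions made for illustration, not the authors' implementation.

import mlx.core as mx


def apply_dirichlet(interior, t_top, t_bot, t_left, t_right):
    # Wrap the interior field with the four fixed boundary temperatures.
    top = mx.full((1, interior.shape[1]), t_top)
    bot = mx.full((1, interior.shape[1]), t_bot)
    core = mx.concatenate([top, interior, bot], axis=0)
    left = mx.full((core.shape[0], 1), t_left)
    right = mx.full((core.shape[0], 1), t_right)
    return mx.concatenate([left, core, right], axis=1)


def make_sample(n=32, steps=500):
    # Four random Dirichlet boundary temperatures and a random diffusivity
    # (ranges are placeholder assumptions).
    b = mx.random.uniform(0.0, 1.0, (4,))
    t_top, t_bot, t_left, t_right = (b[i].item() for i in range(4))
    alpha = mx.random.uniform(0.01, 0.1, (1,)).item()
    dx = 1.0 / (n - 1)
    dt = 0.2 * dx * dx / alpha          # below the explicit stability limit of 0.25

    # Uniform but random initial interior temperature.
    t0 = mx.random.uniform(0.0, 1.0, (1,)).item()
    T = apply_dirichlet(mx.full((n - 2, n - 2), t0), t_top, t_bot, t_left, t_right)

    frames = [T]
    for _ in range(steps):
        # Central second differences on the interior, forward-Euler in time.
        lap = (T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
               - 4.0 * T[1:-1, 1:-1]) / (dx * dx)
        interior = T[1:-1, 1:-1] + alpha * dt * lap
        T = apply_dirichlet(interior, t_top, t_bot, t_left, t_right)
        frames.append(T)
    return mx.stack(frames), alpha      # trajectory of shape (steps + 1, n, n)

A collection of such trajectories could then be split into training, validation and test sets, with validation evaluated in-line during training as the description states.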
format Article
id doaj-art-902395872dcc41c090a1da1ef6fd4621
institution OA Journals
issn 2590-1230
language English
publishDate 2025-06-01
publisher Elsevier
record_format Article
series Results in Engineering
spelling doaj-art-902395872dcc41c090a1da1ef6fd4621; 2025-08-20T02:17:27Z; eng; Elsevier; Results in Engineering; 2590-1230; 2025-06-01; vol. 26; art. 104871; 10.1016/j.rineng.2025.104871; Beyond language: Applying MLX transformers to engineering physics; Stavros Kassinos (Computational Sciences Laboratory, Department of Mechanical Engineering, University of Cyprus, 1 University Avenue, Aglantzia, 2109, Nicosia, Cyprus; corresponding author); Alessio Alexiadis (School of Chemical Engineering, University of Birmingham, Edgbaston, Birmingham, B15 2TT, United Kingdom); Transformer Neural Networks are driving an explosion of activity and discovery in the field of Large Language Models (LLMs). In contrast, there have been only a few attempts to apply Transformers in engineering physics. Aiming to offer an easy entry point to physics-centric Transformers, we introduce a physics-informed Transformer model for solving the heat conduction problem in a 2D plate with Dirichlet boundary conditions. The model is implemented in the machine learning framework MLX and leverages the unified memory of Apple M-series processors. The use of MLX means that the models can be trained and perform predictions efficiently on personal machines with only modest memory requirements. To train, validate and test the Transformer model, we solve the 2D heat conduction problem using central finite differences. Each finite difference solution in these sets is initialized with four random Dirichlet boundary conditions, a uniform but random internal temperature distribution and a randomly selected thermal diffusivity. Validation is performed in-line during training to guard against over-fitting. The excellent performance of the trained model is demonstrated by predicting the evolution of the temperature field to steady state for the unseen test set of conditions. http://www.sciencedirect.com/science/article/pii/S2590123025009478; Physics-informed transformers; MLX framework; Heat conduction
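
The record describes a physics-informed Transformer implemented in MLX. As a purely illustrative sketch (not the paper's architecture), the block below assembles a standard pre-norm Transformer encoder layer from mlx.nn primitives, of the kind that could process a tokenised temperature field; the token layout, dimensions and names are assumptions.

import mlx.core as mx
import mlx.nn as nn


class EncoderBlock(nn.Module):
    # One pre-norm self-attention + feed-forward block with residual connections.
    def __init__(self, dims, num_heads, mlp_dims):
        super().__init__()
        self.attn = nn.MultiHeadAttention(dims, num_heads)
        self.norm1 = nn.LayerNorm(dims)
        self.norm2 = nn.LayerNorm(dims)
        self.mlp = nn.Sequential(nn.Linear(dims, mlp_dims), nn.GELU(), nn.Linear(mlp_dims, dims))

    def __call__(self, x, mask=None):
        y = self.norm1(x)
        x = x + self.attn(y, y, y, mask)        # self-attention over the token sequence
        x = x + self.mlp(self.norm2(x))         # position-wise feed-forward network
        return x


# Placeholder shapes: a batch of 8 sequences of 64 tokens with 128 features each.
x = mx.random.normal((8, 64, 128))
block = EncoderBlock(dims=128, num_heads=4, mlp_dims=256)
print(block(x).shape)                           # (8, 64, 128)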
spellingShingle Stavros Kassinos
Alessio Alexiadis
Beyond language: Applying MLX transformers to engineering physics
Results in Engineering
Physics-informed transformers
MLX framework
Heat conduction
title Beyond language: Applying MLX transformers to engineering physics
title_full Beyond language: Applying MLX transformers to engineering physics
title_fullStr Beyond language: Applying MLX transformers to engineering physics
title_full_unstemmed Beyond language: Applying MLX transformers to engineering physics
title_short Beyond language: Applying MLX transformers to engineering physics
title_sort beyond language applying mlx transformers to engineering physics
topic Physics-informed transformers
MLX framework
Heat conduction
url http://www.sciencedirect.com/science/article/pii/S2590123025009478
work_keys_str_mv AT stavroskassinos beyondlanguageapplyingmlxtransformerstoengineeringphysicsmlxbeyondlanguage
AT alessioalexiadis beyondlanguageapplyingmlxtransformerstoengineeringphysicsmlxbeyondlanguage