Machine learning Hubbard parameters with equivariant neural networks
Abstract Density-functional theory with extended Hubbard functionals (DFT + U + V) provides a robust framework to accurately describe complex materials containing transition-metal or rare-earth elements. It does so by mitigating self-interaction errors inherent to semi-local functionals which are particularly pronounced in systems with partially-filled d and f electronic states. However, achieving accuracy in this approach hinges upon the accurate determination of the on-site U and inter-site V Hubbard parameters. In practice, these are obtained either by semi-empirical tuning, requiring prior knowledge, or, more correctly, by using predictive but expensive first-principles calculations. Here, we present a machine learning model based on equivariant neural networks which uses atomic occupation matrices as descriptors, directly capturing the electronic structure, local chemical environment, and oxidation states of the system at hand. We target here the prediction of Hubbard parameters computed self-consistently with iterative linear-response calculations, as implemented in density-functional perturbation theory (DFPT), and structural relaxations. Remarkably, when trained on data from 12 materials spanning various crystal structures and compositions, our model achieves mean absolute relative errors of 3% and 5% for Hubbard U and V parameters, respectively. By circumventing computationally expensive DFT or DFPT self-consistent protocols, our model significantly expedites the prediction of Hubbard parameters with negligible computational overhead, while approaching the accuracy of DFPT. Moreover, owing to its robust transferability, the model facilitates accelerated materials discovery and design via high-throughput calculations, with relevance for various technological applications.
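For context, the on-site U and inter-site V parameters referred to in the abstract enter an extended Hubbard correction to the DFT total energy, and the atomic occupation matrices used as descriptors are projections of the Kohn-Sham states onto localized atomic orbitals. The expressions below follow the standard DFT + U + V literature and are shown only as a reminder of the quantities involved; conventions (e.g. spin sums and normalizations) may differ slightly from those of the paper itself. A small code sketch of the descriptor idea follows the record summary below.

$$
E_{U+V} = \sum_{I,\sigma} \frac{U^{I}}{2}\,\mathrm{Tr}\!\left[\mathbf{n}^{II\sigma}\left(\mathbf{1}-\mathbf{n}^{II\sigma}\right)\right]
\;-\; \sum_{I\neq J,\sigma} \frac{V^{IJ}}{2}\,\mathrm{Tr}\!\left[\mathbf{n}^{IJ\sigma}\,\mathbf{n}^{JI\sigma}\right],
\qquad
n^{IJ\sigma}_{mm'} = \sum_{v,\mathbf{k}} f^{\sigma}_{v\mathbf{k}}\,
\langle \psi^{\sigma}_{v\mathbf{k}} \,|\, \phi^{J}_{m'} \rangle
\langle \phi^{I}_{m} \,|\, \psi^{\sigma}_{v\mathbf{k}} \rangle ,
$$

where $\mathbf{n}^{IJ\sigma}$ are the (generalized) occupation matrices for the atom pair $(I, J)$, $\phi^{I}_{m}$ are localized atomic orbitals, and $f^{\sigma}_{v\mathbf{k}}$ are the occupations of the Kohn-Sham states $\psi^{\sigma}_{v\mathbf{k}}$. The quoted 3% and 5% accuracies are mean absolute relative errors with respect to the DFPT reference values, i.e. $\frac{1}{N}\sum_{i}\left|\left(y^{\mathrm{ML}}_{i}-y^{\mathrm{DFPT}}_{i}\right)/y^{\mathrm{DFPT}}_{i}\right|$.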
Main Authors: | Martin Uhrin, Austin Zadoks, Luca Binci, Nicola Marzari, Iurii Timrov |
---|---|
Format: | Article |
Language: | English |
Published: | Nature Portfolio, 2025-01-01 |
Series: | npj Computational Materials |
Online Access: | https://doi.org/10.1038/s41524-024-01501-5 |
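As a purely illustrative aside, and not the equivariant architecture used in the paper, the sketch below shows how an atomic occupation matrix can serve as a descriptor for regressing a Hubbard U value: it replaces the paper's equivariant network with a few hand-built rotation-invariant features (eigenvalues and traces) fed to a small plain-PyTorch multilayer perceptron. All shapes, layer sizes, and names are assumptions made for the example.

```python
# Minimal, hypothetical sketch (NOT the paper's equivariant model): regress a
# Hubbard U value from rotation-invariant features of a spin-resolved d-shell
# occupation matrix. Shapes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

def invariant_features(n_occ: torch.Tensor) -> torch.Tensor:
    """Invariant summary of a (2, 5, 5) occupation matrix (spin, m, m'):
    per-spin eigenvalues plus the traces of n and n^2."""
    eigvals = torch.linalg.eigvalsh(n_occ)                      # (2, 5), rotation invariant
    tr_n = n_occ.diagonal(dim1=-2, dim2=-1).sum(-1)             # (2,), occupation per spin
    tr_n2 = (n_occ @ n_occ).diagonal(dim1=-2, dim2=-1).sum(-1)  # (2,), idempotency measure
    return torch.cat([eigvals.flatten(), tr_n, tr_n2])          # (14,)

# Tiny regressor mapping the invariant features to a single U value (in eV).
model = nn.Sequential(
    nn.Linear(14, 64), nn.SiLU(),
    nn.Linear(64, 64), nn.SiLU(),
    nn.Linear(64, 1),
)

# Toy usage with a random symmetric matrix standing in for a real occupation matrix.
a = torch.rand(2, 5, 5)
n_occ = 0.5 * (a + a.transpose(-1, -2))
u_pred = model(invariant_features(n_occ))
print(f"predicted U (untrained, meaningless): {u_pred.item():.3f} eV")
```

Eigenvalues and traces are used here because they are invariant under rotations of the orbital basis, which is the minimal symmetry requirement that the paper's equivariant network satisfies by construction while retaining the full orbital-resolved information.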
author | Martin Uhrin, Austin Zadoks, Luca Binci, Nicola Marzari, Iurii Timrov |
collection | DOAJ |
description | Abstract Density-functional theory with extended Hubbard functionals (DFT + U + V) provides a robust framework to accurately describe complex materials containing transition-metal or rare-earth elements. It does so by mitigating self-interaction errors inherent to semi-local functionals which are particularly pronounced in systems with partially-filled d and f electronic states. However, achieving accuracy in this approach hinges upon the accurate determination of the on-site U and inter-site V Hubbard parameters. In practice, these are obtained either by semi-empirical tuning, requiring prior knowledge, or, more correctly, by using predictive but expensive first-principles calculations. Here, we present a machine learning model based on equivariant neural networks which uses atomic occupation matrices as descriptors, directly capturing the electronic structure, local chemical environment, and oxidation states of the system at hand. We target here the prediction of Hubbard parameters computed self-consistently with iterative linear-response calculations, as implemented in density-functional perturbation theory (DFPT), and structural relaxations. Remarkably, when trained on data from 12 materials spanning various crystal structures and compositions, our model achieves mean absolute relative errors of 3% and 5% for Hubbard U and V parameters, respectively. By circumventing computationally expensive DFT or DFPT self-consistent protocols, our model significantly expedites the prediction of Hubbard parameters with negligible computational overhead, while approaching the accuracy of DFPT. Moreover, owing to its robust transferability, the model facilitates accelerated materials discovery and design via high-throughput calculations, with relevance for various technological applications. |
format | Article |
id | doaj-art-ff4ae335b6854a4a854ca7e3d3499f92 |
institution | Kabale University |
issn | 2057-3960 |
language | English |
publishDate | 2025-01-01 |
publisher | Nature Portfolio |
record_format | Article |
series | npj Computational Materials |
spelling | Machine learning Hubbard parameters with equivariant neural networks. Martin Uhrin, Austin Zadoks, Luca Binci, Nicola Marzari, Iurii Timrov (all: Theory and Simulation of Materials (THEOS), and National Centre for Computational Design and Discovery of Novel Materials (MARVEL), École Polytechnique Fédérale de Lausanne (EPFL)). npj Computational Materials, Nature Portfolio, ISSN 2057-3960, 2025-01-01. https://doi.org/10.1038/s41524-024-01501-5 |
title | Machine learning Hubbard parameters with equivariant neural networks |
url | https://doi.org/10.1038/s41524-024-01501-5 |