XElemNet: towards explainable AI for deep neural networks in materials science

Bibliographic Details
Main Authors: Kewei Wang, Vishu Gupta, Claire Songhyun Lee, Yuwei Mao, Muhammed Nur Talha Kilic, Youjia Li, Zanhua Huang, Wei-keng Liao, Alok Choudhary, Ankit Agrawal
Format: Article
Language: English
Published: Nature Portfolio 2024-10-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-024-76535-2
Description
Summary: Recent progress in deep learning has significantly impacted materials science, leading to accelerated material discovery and innovation. ElemNet, a deep neural network model that predicts formation energy from elemental compositions, exemplifies the application of deep learning techniques in this field. However, the “black-box” nature of deep learning models often raises concerns about their interpretability and reliability. In this study, we propose XElemNet to explore the interpretability of ElemNet by applying a series of explainable artificial intelligence (XAI) techniques, focusing on post-hoc analysis and model transparency. The experiments with artificial binary datasets reveal ElemNet’s effectiveness in predicting convex hulls of element-pair systems across periodic table groups, indicating its capability to effectively discern elemental interactions in most cases. Additionally, feature importance analysis within ElemNet highlights alignment with chemical properties of elements such as reactivity and electronegativity. XElemNet provides insights into the strengths and limitations of ElemNet and offers a potential pathway for explaining other deep learning models in materials science.
ISSN: 2045-2322
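
Illustrative sketch: the abstract mentions post-hoc feature-importance analysis of a composition-to-formation-energy model. The Python snippet below is a minimal sketch of that idea, not the authors' XElemNet pipeline. It trains a small neural-network regressor on synthetic composition vectors and scores each element's input fraction with permutation importance; the toy element list, the synthetic target rule, the network architecture, and the choice of permutation importance (the paper may use different XAI techniques) are all assumptions made for illustration only.

    # Minimal sketch (not the authors' code): post-hoc feature importance
    # for a composition -> formation-energy regressor.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)
    elements = ["H", "Li", "O", "Na", "Cl", "Fe"]  # toy element vocabulary (assumption)

    # Synthetic "compositions": random fractions over the toy element set.
    X = rng.random((500, len(elements)))
    X /= X.sum(axis=1, keepdims=True)

    # Synthetic "formation energy" driven mainly by the O and Cl fractions,
    # standing in for electronegativity-driven behavior (assumption).
    y = (-2.0 * X[:, elements.index("O")]
         - 1.5 * X[:, elements.index("Cl")]
         + 0.1 * rng.standard_normal(500))

    model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
    model.fit(X, y)

    # Permutation importance: how much does shuffling each element's fraction
    # degrade the fit? A larger drop means a more important input feature.
    result = permutation_importance(model, X, y, n_repeats=20, random_state=0)
    for name, score in sorted(zip(elements, result.importances_mean),
                              key=lambda t: -t[1]):
        print(f"{name:2s}  importance = {score:.3f}")

Running this should rank O and Cl highest, mirroring how a post-hoc importance analysis can surface chemically meaningful inputs in a trained model.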