Diverse explanations from data-driven and domain-driven perspectives in the physical sciences

Machine learning (ML) methods have been remarkably successful in materials science, providing novel scientific insights, guiding future laboratory experiments, and accelerating materials discovery. Despite the promising performance of these models, understanding the decisions they make is also essential to ensure the scientific value of their outcomes. However, there is a recent and ongoing debate about the diversity of explanations, which potentially leads to scientific inconsistency. This Perspective explores the sources and implications of these diverse explanations in ML applications for the physical sciences. Through three case studies in materials science and molecular property prediction, we examine how different models, explanation methods, levels of feature attribution, and stakeholder needs can result in varying interpretations of ML outputs. Our analysis underscores the importance of considering multiple perspectives when interpreting ML models in scientific contexts and highlights the critical need for scientists to maintain control over the interpretation process, balancing data-driven insights with domain expertise to meet specific scientific needs. By fostering a comprehensive understanding of these inconsistencies, we aim to contribute to the responsible integration of eXplainable artificial intelligence (XAI) into the physical sciences and improve the trustworthiness of ML applications in scientific discovery.


Bibliographic Details
Main Authors: Sichao Li, Xin Wang, Amanda Barnard
Format: Article
Language: English
Published: IOP Publishing, 2025-01-01
Series: Machine Learning: Science and Technology
ISSN: 2632-2153
Collection: DOAJ
Subjects: XAI; physical science; explanations
Online Access: https://doi.org/10.1088/2632-2153/ad9137
Sichao Li (https://orcid.org/0000-0002-0097-6754), Xin Wang, and Amanda Barnard (https://orcid.org/0000-0002-4784-2382), School of Computing, Australian National University, Canberra, Australia. "Diverse explanations from data-driven and domain-driven perspectives in the physical sciences." Machine Learning: Science and Technology 6(1): 013002 (2025). https://doi.org/10.1088/2632-2153/ad9137