SIMPROV: Provenance capturing for simulation studies.

Bibliographic Details
Main Authors: Andreas Ruscheinski, Anja Wolpers, Philipp Henning, Pia Wilsdorf, Adelinde M. Uhrmacher
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0327607
Description
Summary: Improving interpretability and reusability has become paramount for modeling and simulation studies. Provenance, which encompasses information about the entities, activities, and agents involved in producing a model, experiment, or data, is pivotal in achieving this goal. However, capturing provenance in simulation studies presents a tremendous challenge due to the diverse software systems employed by modelers and the various entities and activities to be considered. Existing methods automatically capture only partial provenance from individual software systems, leaving gaps in the overall story of a simulation study. To address this limitation, we introduce a lightweight method that records the provenance of complete simulation studies by monitoring the modeler in their familiar yet heterogeneous work environment, imposing as few restrictions as possible. The approach emphasizes a clear separation of concerns between provenance capturers, which collect data from the diverse software systems used, and a provenance builder, which assembles this information into a coherent provenance graph. Furthermore, we provide a web interface that enables modelers to enhance and explore their provenance graphs. We showcase the practicality of SIMPROV through two cell-biological case studies.
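
The summary describes an architecture in which per-tool provenance capturers report observations to a single provenance builder, which assembles them into a graph of entities, activities, and agents (the vocabulary of the W3C PROV model). As a rough illustration only, the Python sketch below shows what such an assembly step could look like; every name in it (ProvRecord, ProvBuilder, the method signatures) is a hypothetical stand-in, not the actual SIMPROV API, which this record does not specify.

    from dataclasses import dataclass

    @dataclass
    class ProvRecord:
        # One observation reported by a capturer: an activity that used
        # some entities, generated others, and is attributed to an agent.
        # (Hypothetical structure; not taken from the SIMPROV code base.)
        activity: str
        used: list
        generated: list
        agent: str

    class ProvBuilder:
        """Assembles records from heterogeneous capturers into one graph."""

        def __init__(self):
            self.nodes = set()
            self.edges = []  # (source, relation, target) triples

        def add(self, rec: ProvRecord):
            # Register all nodes mentioned in the record.
            self.nodes.update([rec.activity, rec.agent, *rec.used, *rec.generated])
            # Link them with PROV-style relations.
            self.edges.append((rec.activity, "wasAssociatedWith", rec.agent))
            for entity in rec.used:
                self.edges.append((rec.activity, "used", entity))
            for entity in rec.generated:
                self.edges.append((entity, "wasGeneratedBy", rec.activity))

    # Two capturers monitoring different tools report to the same builder:
    builder = ProvBuilder()
    builder.add(ProvRecord("calibrate_model", ["model_v1", "data.csv"],
                           ["model_v2"], "modeler"))
    builder.add(ProvRecord("run_experiment", ["model_v2"],
                           ["results.h5"], "modeler"))

Keeping the builder as the sole writer of the graph is what the stated separation of concerns buys: new capturers for additional tools can be added without coordinating with one another, since each only has to emit records in a shared format.
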
ISSN: 1932-6203