Improving 3D deep learning segmentation with biophysically motivated cell synthesis

Abstract Biomedical research increasingly relies on three-dimensional (3D) cell culture models, and artificial-intelligence-based analysis can potentially facilitate detailed and accurate feature extraction at the single-cell level. However, this requires precise segmentation of 3D cell datasets, which in turn demands high-quality ground truth for training. Manual annotation, the gold standard for ground truth data, is too time-consuming and thus not feasible for the generation of large 3D training datasets. To address this, we present a framework for generating 3D training data that integrates biophysical modeling for realistic cell shape and alignment. Our approach allows the in silico generation of coherent membrane and nuclei signals, enabling the training of segmentation models that utilize both channels for improved performance. Furthermore, we present a generative adversarial network (GAN) training scheme that generates not only image data but also matching labels. Quantitative evaluation shows superior performance of biophysically motivated synthetic training data, even outperforming manual annotation and pretrained models. This underscores the potential of incorporating biophysical modeling for enhancing synthetic training data quality.
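
The record itself contains no code; the following minimal PyTorch sketch is not from the article and only illustrates the GAN idea summarized in the abstract, i.e. a generator that emits a synthetic two-channel image volume (nuclei and membrane) together with a pixel-aligned label volume. All names, layer sizes, and channel counts are illustrative assumptions, not the authors' architecture.

# Hypothetical sketch, not the authors' published code: a tiny 3D generator
# with two output heads, so every synthetic image volume comes with a
# matching label volume "for free".
import torch
import torch.nn as nn

class JointGenerator(nn.Module):
    def __init__(self, latent_channels=32, base_channels=16):
        super().__init__()
        # Shared decoder: upsamples the latent volume twice (x4 in each spatial dim).
        self.backbone = nn.Sequential(
            nn.ConvTranspose3d(latent_channels, base_channels * 2, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose3d(base_channels * 2, base_channels, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Two heads decoded from the same features keep image appearance and
        # label geometry coherent with each other.
        self.image_head = nn.Conv3d(base_channels, 2, kernel_size=3, padding=1)  # nuclei + membrane channels
        self.label_head = nn.Conv3d(base_channels, 1, kernel_size=3, padding=1)  # foreground probability

    def forward(self, z):
        features = self.backbone(z)
        image = torch.sigmoid(self.image_head(features))  # intensities in [0, 1]
        label = torch.sigmoid(self.label_head(features))  # threshold (e.g. > 0.5) to obtain masks
        return image, label

if __name__ == "__main__":
    generator = JointGenerator()
    z = torch.randn(1, 32, 8, 8, 8)   # latent "seed" volume (batch, channels, D, H, W)
    image, label = generator(z)
    print(image.shape, label.shape)   # torch.Size([1, 2, 32, 32, 32]) torch.Size([1, 1, 32, 32, 32])

In a full adversarial setup, a 3D discriminator would typically score image-label pairs jointly, which is what keeps the generated labels consistent with the generated image content.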

Bibliographic Details
Main Authors: Roman Bruch, Mario Vitacolonna, Elina Nürnberg, Simeon Sauer, Rüdiger Rudolf, Markus Reischl
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Communications Biology
Online Access: https://doi.org/10.1038/s42003-025-07469-2
ISSN: 2399-3642
Author Affiliations: Roman Bruch (Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology); Mario Vitacolonna, Elina Nürnberg, Simeon Sauer, Rüdiger Rudolf (Institute of Molecular and Cell Biology, Mannheim University of Applied Sciences); Markus Reischl (Institute for Automation and Applied Informatics, Karlsruhe Institute of Technology)