Liver segmentation network based on detail enhancement and multi-scale feature fusion


Bibliographic Details
Main Authors: Lu Tinglan, Qin Jun, Qin Guihe, Shi Weili, Zhang Wentao
Format: Article
Language: English
Published: Nature Portfolio 2025-01-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-024-78917-y
Description
Summary: Liver segmentation is challenging because abdominal CT (Computed Tomography) images have low contrast, and the liver resembles neighboring organs such as the spleen, stomach, and kidneys in both intensity and shape. Additionally, 2D CT slices taken along different planes (sagittal, coronal, and transverse) increase the diversity of liver morphology and the complexity of segmentation. To address these issues, this paper proposes a Detail Enhanced Convolution (DE Conv) to improve liver feature learning and thereby enhance segmentation performance. Furthermore, to help the model learn liver features at different scales, a Multi-Scale Feature Fusion (MSFF) module is added to the skip connections; MSFF strengthens the capture of global features and thus improves segmentation accuracy. Combining these components yields a liver segmentation network based on detail enhancement and multi-scale feature fusion (DEMF-Net). Extensive experiments on the LiTS17 dataset show that DEMF-Net achieves significant improvements across various evaluation metrics and can deliver precise liver segmentation.
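The abstract does not spell out the internals of DE Conv or MSFF. As a rough illustration only, the PyTorch sketch below shows one common way a multi-scale fusion block can sit on a U-Net-style skip connection: parallel dilated convolutions widen the receptive field at several scales before a 1x1 fusion. The class name, dilation rates, and residual connection are assumptions for illustration, not the authors' published design.

```python
import torch
import torch.nn as nn

class MSFFSketch(nn.Module):
    """Illustrative multi-scale feature fusion block (not the paper's exact MSFF).

    Parallel dilated 3x3 convolutions capture context at several
    receptive-field sizes; the branches are concatenated and fused back
    to the input width so the block can replace an identity skip
    connection in a U-Net-style segmentation network.
    """

    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # 1x1 convolution fuses the concatenated branches back to `channels`.
        self.fuse = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        # Residual addition preserves the original fine detail alongside
        # the multi-scale context.
        return self.fuse(feats) + x


if __name__ == "__main__":
    skip = torch.randn(1, 64, 128, 128)    # a skip-connection feature map
    print(MSFFSketch(64)(skip).shape)      # torch.Size([1, 64, 128, 128])
```

Because the block preserves channel count and spatial size, it can be dropped onto each skip connection of an encoder-decoder segmentation model without changing the rest of the architecture.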
ISSN: 2045-2322