DBDST-Net: Dual-Branch Decoupled Image Style Transfer Network

Bibliographic Details
Main Authors: Na Su, Jingtao Wang, Jingjing Zhang, Ying Li, Yun Pan
Format: Article
Language: English
Published: MDPI AG, 2025-06-01
Series: Information
Online Access: https://www.mdpi.com/2078-2489/16/7/561
Collection: DOAJ
Description: The image style transfer task aims to apply the style characteristics of a reference image to a content image, generating a new stylized result. While many existing methods focus on designing feature transfer modules and have achieved promising results, they often overlook the entanglement between content and style features after transfer, making effective separation challenging. To address this issue, we propose a Dual-Branch Decoupled Image Style Transfer Network (DBDST-Net) to better disentangle content and style representations. The network consists of two branches: a Content Feature Decoupling Branch, which captures fine-grained content structures for more precise content separation, and a Style Feature Decoupling Branch, which enhances sensitivity to style-specific attributes. To further improve the decoupling performance, we introduce a dense-regressive loss that minimizes the discrepancy between the original content image and the content reconstructed from the stylized output, thereby promoting the independence of content and style features while enhancing image quality. Additionally, to mitigate the limited availability of style data, we employ the Stable Diffusion model to generate stylized samples for data augmentation. Extensive experiments demonstrate that our method achieves a better balance between content preservation and style rendering compared to existing approaches.
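The dense-regressive loss described above penalizes the discrepancy between the original content image and the content recovered from the stylized output. The paper's exact formulation is not given in this record; the NumPy sketch below only illustrates the general idea, using a hypothetical content reconstruction (passed in directly) and a mean-absolute-error distance as the discrepancy measure.

```python
import numpy as np

def dense_regressive_loss(content_img, reconstructed_content):
    """Illustrative stand-in for a dense-regressive loss: the mean absolute
    discrepancy between the original content image and the content
    reconstructed from the stylized output (assumed formulation)."""
    return float(np.abs(content_img - reconstructed_content).mean())

# Toy example on a hypothetical 3x8x8 image tensor.
content = np.ones((3, 8, 8), dtype=np.float32)
recon_perfect = content.copy()      # perfect reconstruction
recon_noisy = content + 0.5         # uniformly off by 0.5

print(dense_regressive_loss(content, recon_perfect))  # 0.0
print(dense_regressive_loss(content, recon_noisy))    # 0.5
```

Driving this discrepancy toward zero encourages the stylized output to retain the content structure independently of the injected style, which is the stated goal of the loss.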
Record ID: doaj-art-41a9948f9f07421faae9035dcc380e68
ISSN: 2078-2489
DOI: 10.3390/info16070561
Published in: Information, Vol. 16, No. 7, Article 561 (2025)
Author Affiliations: State Key Laboratory of Media Convergence and Communication, Communication University of China, Beijing 100024, China (all five authors)
Subjects: image style transfer; diffusion module; feature decoupling; degradation representations