Comprehensive Style Transfer for Facial Images Using Enhanced Feature Attribution in Generative Adversarial Nets
Image-to-image translation is a fundamental task in computer vision that transforms images between domains while preserving essential content. Although adaptive instance normalization (AdaIN) is widely used for style transfer, its reliance on simple statistical measures (mean and variance) may limit...
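For reference, the standard AdaIN operation the abstract refers to aligns the per-channel mean and variance of content features with those of style features. A minimal PyTorch-style sketch is shown below; this is an illustration of plain AdaIN only, not the authors' enhanced feature-attribution variant, and the tensor names and shapes are assumed for the example.

```python
import torch

def adain(content_feat: torch.Tensor, style_feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Adaptive instance normalization over (N, C, H, W) feature maps:
    shift the content features' per-channel statistics to match the style features'."""
    # Per-sample, per-channel statistics computed over the spatial dimensions
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    # Normalize the content features, then rescale and shift with the style statistics
    return s_std * (content_feat - c_mean) / c_std + s_mean
```

Because only the first two moments of each channel are transferred, finer distributional structure of the style features is ignored, which is the limitation the abstract alludes to.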
| Main Authors: | Yongseon Yoo, Seonggyu Kim, Jong-Min Lee |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/11017600/ |
Similar Items
- Local Binary Pattern–Cycle Generative Adversarial Network Transfer: Transforming Image Style from Day to Night, by Abeer Almohamade, et al. Published: (2025-03-01)
- VariGAN: Enhancing Image Style Transfer via UNet Generator, Depthwise Discriminator, and LPIPS Loss in Adversarial Learning Framework, by Dawei Guan, et al. Published: (2025-04-01)
- End-to-End Design of Webtoon-Style Portrait Stylization System for Real-World Demo Booth, by Sojeong Kim, et al. Published: (2025-01-01)
- Applying deep learning for style transfer in digital art: enhancing creative expression through neural networks, by Shijun Zhang, et al. Published: (2025-04-01)
- Improving the Parameterization of Complex Subsurface Flow Properties With Style‐Based Generative Adversarial Network (StyleGAN), by Wei Ling, et al. Published: (2024-11-01)