Regularization for Unconditional Image Diffusion Models via Shifted Data Augmentation
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11048911/ |
| Summary: | Diffusion models are a powerful class of machine-learning techniques for generating realistic data, but they are highly prone to overfitting, especially with limited training data. While data augmentation such as image rotation can mitigate this issue, it often causes leakage, where augmented content appears in generated samples. In this paper, we propose a novel regularization framework, called shifted data-augmentation (SDA), for training unconditional diffusion models. SDA introduces an auxiliary diffusion path using transformed data and the noise-shift technique alongside the standard path with original data. This dual-path structure enables effective regularization while suppressing leakage through a trajectory shift in the diffusion process. We implement SDA with image rotation as its most basic and interpretable configuration. We also conduct synthetic and empirical analyses demonstrating that SDA effectively leverages the regularization benefit of image rotation. In particular, SDA yielded notable performance in training with limited data. |
| ISSN: | 2169-3536 |
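The abstract describes a dual-path training objective: a standard denoising path on the original data plus an auxiliary path on transformed (e.g., rotated) data whose diffusion trajectory is shifted so the two paths stay distinguishable. The paper's exact noise-shift formulation is not given in the abstract, so the sketch below is only an illustrative guess: `sda_loss`, the `shift` parameter, and the additive mean shift on the auxiliary noise target are all assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(x0, eps, t, T=1000):
    """Toy DDPM-style forward process with a linear alpha-bar schedule
    (for illustration only, not the paper's schedule)."""
    alpha_bar = 1.0 - t / T
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

def sda_loss(model, x0, t, shift=0.1):
    """Hypothetical dual-path loss: standard path on the original image
    plus an auxiliary path on a rotated copy whose noise target is
    mean-shifted by `shift` to separate the two trajectories."""
    # Standard path: predict the noise added to the original image.
    eps = rng.standard_normal(x0.shape)
    x_t = add_noise(x0, eps, t)
    loss_main = np.mean((model(x_t, t) - eps) ** 2)

    # Auxiliary path: rotate the image 90 degrees and shift the noise
    # target, a stand-in for the paper's trajectory shift.
    x0_rot = np.rot90(x0, k=1, axes=(-2, -1))
    eps_rot = rng.standard_normal(x0.shape) + shift
    x_t_rot = add_noise(x0_rot, eps_rot, t)
    loss_aux = np.mean((model(x_t_rot, t) - eps_rot) ** 2)

    return loss_main + loss_aux

# Dummy "model" that predicts zero noise, just to exercise the function.
dummy = lambda x, t: np.zeros_like(x)
x0 = rng.standard_normal((1, 8, 8))
loss = sda_loss(dummy, x0, t=500)
```

In a real setup the dummy model would be a trained noise-prediction network, and both paths would share its weights so the augmented data regularizes training without the rotated content leaking into samples drawn from the standard trajectory.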