Research on image generation technology based on deep learning
Main Author:
Format: Article
Language: English
Published: EDP Sciences, 2025-01-01
Series: ITM Web of Conferences
Online Access: https://www.itm-conferences.org/articles/itmconf/pdf/2025/01/itmconf_dai2024_02011.pdf
Summary: In the realm of image generation, deep learning stands out as an effective and valuable machine learning technique. By using multi-layer neural network models, deep learning can automatically learn the intrinsic features of images and thereby generate high-quality images. In recent years, deep learning-based image generation technology has made significant progress. This paper mainly introduces the two main methods: the generative adversarial network (GAN) and the variational autoencoder (VAE). GANs have been widely applied to image generation, image inpainting, and related tasks, while VAEs perform well in image generation, image classification, and so on. However, current image generation technologies still suffer from insufficient diversity and realism. Addressing these problems, this paper analyzes ways to improve and optimize the mainstream image generation algorithms from several perspectives: improving and optimizing the loss function, improving latent-space modeling, revising the structure of both the generator and the discriminator, and speeding up the training process. Furthermore, the performance of these methods on image generation tasks is compared, and the strengths and weaknesses of each approach are evaluated. Image generation has emerged as a prominent research area in contemporary academia, with ample room for further exploration and practice.
ISSN: 2271-2097
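The summary above names the GAN, an adversarial game between a generator and a discriminator, as one of the two mainstream approaches. As a rough illustration of that training loop (a minimal sketch, not code from the paper itself; the network sizes, learning rates, and the flattened 28x28 image shape are assumptions made for the example), one step of GAN training in PyTorch might look like this:

```python
# Minimal GAN training sketch in PyTorch. Architectures and hyperparameters
# are illustrative assumptions, not values taken from the paper.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28  # assumed MNIST-scale flattened images

# Generator: maps a latent noise vector to a flattened image in [-1, 1].
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)

# Discriminator: outputs a "realness" logit for a flattened image.
D = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real: torch.Tensor) -> None:
    batch = real.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # Discriminator update: push real images toward 1, generated toward 0.
    z = torch.randn(batch, latent_dim)
    fake = G(z).detach()  # block gradients into G during the D step
    loss_d = bce(D(real), real_labels) + bce(D(fake), fake_labels)
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator update: try to make D score fresh fakes as real.
    z = torch.randn(batch, latent_dim)
    loss_g = bce(D(G(z)), real_labels)
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

# Usage with random stand-in data (replace with a real image batch):
train_step(torch.rand(16, img_dim) * 2 - 1)
```

A VAE, by contrast, replaces the adversarial game with an encoder-decoder pair trained to maximize the evidence lower bound (a reconstruction term minus a KL-divergence penalty toward the latent prior). This typically trains more stably than a GAN but tends to yield blurrier samples, a trade-off that helps motivate the loss-function and latent-space improvements the summary mentions.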