A comparative review of latent spaces in latent diffusion model with other generative models
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects:
Online Access: https://hdl.handle.net/10356/175639
Institution: Nanyang Technological University
Summary: Generative models have become predominant in modern machine learning, enabling the synthesis of high-quality and diverse outputs. This study examines the mechanisms of various generative models, namely Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and Diffusion Models (DMs), and the latent spaces of these models, focusing on their structure and role in generating efficient and adaptable outputs. Despite progress in optimizing these models, the latent spaces within diffusion models, particularly latent diffusion models (LDMs), remain less explored than those of GANs and VAEs, mainly due to their relatively complex training process. This paper aims to address this gap by reviewing and comparing the latent spaces in LDMs with those in GANs and VAEs. Through this research, we seek to contribute to the understanding of latent diffusion models and provide insights that may assist future exploration of generative model design.
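The contrast the summary draws can be sketched in a toy example (illustrative only, not taken from the project): GANs and VAEs decode a single sampled latent in one forward pass, whereas an LDM iteratively refines a noisy latent in latent space before decoding. The linear `decode` and `toy_denoise_step` below are hypothetical stand-ins for trained networks.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM, DATA_DIM = 4, 16
# Toy linear "decoder" standing in for a trained decoder network
# (hypothetical; real models use deep networks).
W = rng.standard_normal((DATA_DIM, LATENT_DIM))

def decode(z):
    """Map a latent vector into data space in a single forward pass."""
    return W @ z

# GAN/VAE-style generation: sample a Gaussian latent, decode once.
z = rng.standard_normal(LATENT_DIM)
x_one_pass = decode(z)

# LDM-style generation: iteratively denoise *in latent space*, then decode.
def toy_denoise_step(z_t, t, T):
    # Stand-in for a learned noise predictor: nudge the noisy latent
    # toward the data manifold (the origin, in this toy) at each step.
    return z_t * (1.0 - 1.0 / (T - t + 1))

T = 50
z_t = rng.standard_normal(LATENT_DIM)  # start from pure latent noise
for t in range(T):
    z_t = toy_denoise_step(z_t, t, T)
x_iterative = decode(z_t)

print(x_one_pass.shape, x_iterative.shape)  # both live in the same data space
```

The point of the sketch is structural: both families ultimately decode from the same low-dimensional latent space, but the LDM spends many small refinement steps there, which is one reason its latent space is harder to characterize than a VAE's single Gaussian prior.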