Generative Models

This repository contains PyTorch implementations of several generative models: Diffusion Models, Energy-Based Models (EBM), Generative Adversarial Networks (GAN), and Variational Autoencoders (VAE).

Installation

1. Clone the repository:

       git clone https://github.com/lvzongyao/generative-models.git
       cd generative-models

2. Create a virtual environment and activate it:

       python -m venv venv
       source venv/bin/activate  # On Windows use `venv\Scripts\activate`

Usage

Each model can be trained using the corresponding script in the `models` directory. The instructions for each model are below:

Variational Autoencoder (VAE)

    python models/vae.py --dataset <dataset> --data_path <path_to_data> --epochs <num_epochs> --batch_size <batch_size> --learning_rate <learning_rate> --latent_dim <latent_dim> --output_dir <output_dir>

For example:

    python models/vae.py --dataset cifar10 --data_path ./data --epochs 20 --batch_size 64 --learning_rate 0.001 --latent_dim 20 --output_dir ./output

Generative Adversarial Network (GAN)

    python models/gan.py --dataset <dataset> --data_path <path_to_data> --epochs <num_epochs> --batch_size <batch_size> --learning_rate <learning_rate> --latent_dim <latent_dim> --output_dir <output_dir>

For example:

    python models/gan.py --dataset cifar10 --data_path ./data --epochs 20 --batch_size 64 --learning_rate 0.0002 --latent_dim 100 --output_dir ./output

Energy-Based Model (EBM)

    python models/ebm.py --dataset <dataset> --data_path <path_to_data> --epochs <num_epochs> --batch_size <batch_size> --learning_rate <learning_rate> --latent_dim <latent_dim> --sample_steps <sample_steps> --step_size <step_size> --noise_scale <noise_scale> --output_dir <output_dir>

For example:

    python models/ebm.py --dataset cifar10 --data_path ./data --epochs 20 --batch_size 64 --learning_rate 0.0001 --sample_steps 10 --step_size 0.01 --noise_scale 0.005 --output_dir ./output

Diffusion Model

    python models/diffusion.py --dataset <dataset> --data_path <path_to_data> --epochs <num_epochs> --batch_size <batch_size> --learning_rate <learning_rate> --timesteps <timesteps> --output_dir <output_dir>

For example:

    python models/diffusion.py --dataset cifar10 --data_path ./data --epochs 20 --batch_size 64 --learning_rate 0.0001 --timesteps 1000 --output_dir ./output

Model Descriptions

Variational Autoencoder (VAE) [paper]

A VAE is a generative model that learns to encode data into a latent space and decode it back to the original space. It is trained to maximize the evidence lower bound (ELBO) on the data likelihood, which combines a reconstruction term with a KL-divergence regularizer that keeps the latent posterior close to the prior.
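As a rough illustration of the ELBO objective (a minimal sketch, not the actual code in `models/vae.py`, which may differ), the loss for a diagonal-Gaussian posterior can be written as:

```python
import torch
import torch.nn.functional as F

def reparameterize(mu, logvar):
    # z = mu + sigma * eps lets gradients flow through the sampling step.
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

def vae_loss(recon_x, x, mu, logvar):
    # Negative ELBO = reconstruction term + KL divergence to the N(0, I) prior.
    recon = F.mse_loss(recon_x, x, reduction="sum")
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian posterior.
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```

Minimizing this loss is equivalent to maximizing the ELBO; the reparameterization trick is what makes the sampling step differentiable.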

Generative Adversarial Network (GAN) [paper]

A GAN consists of two neural networks, a generator and a discriminator, that are trained together. The generator learns to generate realistic data, while the discriminator learns to distinguish between real and generated data.
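The adversarial training loop can be sketched as follows (a hypothetical toy generator and discriminator for illustration; the architectures in `models/gan.py` will differ):

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 100, 784  # assumed sizes for this sketch
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                  nn.Linear(128, data_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
                  nn.Linear(128, 1))  # outputs a logit

bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real):
    batch = real.size(0)
    fake = G(torch.randn(batch, latent_dim))

    # Discriminator: push real -> 1, fake -> 0 (fake is detached so this
    # update does not touch the generator).
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(batch, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch, 1))
    d_loss.backward()
    opt_d.step()

    # Generator: fool the discriminator into predicting 1 for fakes.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

The alternating updates are the core of the minimax game: each `train_step` improves the discriminator on the current generator, then improves the generator against the updated discriminator.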

Energy-Based Model (EBM) [paper]

An energy-based model learns to assign low energy to points from the data distribution and high energy elsewhere. It is trained with a contrastive divergence objective, where negative samples are drawn by Langevin-style sampling (controlled here by the `--sample_steps`, `--step_size`, and `--noise_scale` flags).
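A minimal sketch of the two pieces, assuming a generic `energy_fn` (this is illustrative, not the actual code in `models/ebm.py`): Langevin sampling follows the negative energy gradient with injected noise, and the contrastive loss pushes energy down on data and up on the resulting samples.

```python
import torch

def langevin_sample(energy_fn, x, steps=10, step_size=0.01, noise_scale=0.005):
    # Langevin dynamics; the defaults mirror the CLI flags used above.
    for _ in range(steps):
        x = x.detach().requires_grad_(True)
        energy = energy_fn(x).sum()
        grad, = torch.autograd.grad(energy, x)
        x = x - step_size * grad + noise_scale * torch.randn_like(x)
    return x.detach()

def contrastive_loss(energy_fn, real, fake):
    # Lower energy on data, raise it on model samples.
    return energy_fn(real).mean() - energy_fn(fake).mean()
```

In a full training loop, `fake` would be produced by running `langevin_sample` from noise (or a replay buffer) at every step.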

Diffusion Model [paper]

A diffusion model is a type of generative model that learns to generate data by reversing a diffusion process. The model is trained to denoise data that has been progressively corrupted by noise.
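A DDPM-style sketch of the forward corruption and the denoising objective (a simplified illustration with an assumed linear beta schedule; `models/diffusion.py` may use different schedules and parameterizations):

```python
import torch

T = 1000  # number of timesteps, matching the --timesteps example above
betas = torch.linspace(1e-4, 0.02, T)            # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative signal fraction

def q_sample(x0, t, noise):
    # Forward process in closed form: x_t = sqrt(a_bar)*x0 + sqrt(1-a_bar)*eps.
    a_bar = alphas_bar[t].view(-1, *([1] * (x0.dim() - 1)))
    return a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise

def diffusion_loss(model, x0):
    # Train the model to predict the noise added at a random timestep.
    t = torch.randint(0, T, (x0.size(0),))
    noise = torch.randn_like(x0)
    x_t = q_sample(x0, t, noise)
    return torch.mean((model(x_t, t) - noise) ** 2)
```

Generation then runs the learned reverse process: starting from pure noise at `t = T`, the model's noise prediction is used to denoise step by step back to `t = 0`.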

License

This repository is licensed under the MIT License. See the LICENSE file for more information.
