PyTorch VAE

A collection of Variational Autoencoders (VAEs) implemented in PyTorch with a focus on reproducibility. The aim of this project is to provide a quick and simple working example for many of the cool VAE models out there. All the models are trained on the CelebA dataset for consistency and comparison. The architectures of all the models are kept as similar as possible, using the same layers, except where the original paper necessitates a radically different architecture.
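
To make the shared layout concrete, below is a minimal sketch of a vanilla VAE in PyTorch: an encoder producing a mean and log-variance, the reparameterization trick, a decoder, and an ELBO-style loss. The class name, layer sizes, and loss weighting are illustrative assumptions for 64x64, 3-channel inputs; they are not the exact modules used in this repository.

import torch
from torch import nn
from torch.nn import functional as F

class VanillaVAESketch(nn.Module):
    # Illustrative vanilla VAE for 64x64 RGB images; layer sizes are
    # placeholders, not this repository's exact architecture.
    def __init__(self, in_channels=3, latent_dim=128):
        super().__init__()
        # Encoder: image -> feature map -> flattened -> (mu, log_var)
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1),  # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),           # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.Linear(64 * 16 * 16, latent_dim)
        self.fc_var = nn.Linear(64 * 16 * 16, latent_dim)
        # Decoder: latent vector -> reconstructed image
        self.decoder_input = nn.Linear(latent_dim, 64 * 16 * 16)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=3, stride=2,
                               padding=1, output_padding=1),                 # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, in_channels, kernel_size=3, stride=2,
                               padding=1, output_padding=1),                 # 32 -> 64
            nn.Sigmoid(),
        )

    def reparameterize(self, mu, log_var):
        # z = mu + sigma * eps, so gradients flow through the sampling step.
        std = torch.exp(0.5 * log_var)
        eps = torch.randn_like(std)
        return mu + eps * std

    def forward(self, x):
        h = self.encoder(x)
        mu, log_var = self.fc_mu(h), self.fc_var(h)
        z = self.reparameterize(mu, log_var)
        recon = self.decoder(self.decoder_input(z).view(-1, 64, 16, 16))
        return recon, mu, log_var

def vae_loss(recon, x, mu, log_var, kld_weight=1.0):
    # ELBO-style loss: reconstruction term + KL(q(z|x) || N(0, I)).
    recon_loss = F.mse_loss(recon, x)
    kld = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim=1))
    return recon_loss + kld_weight * kld

The models in this repository share this encoder/decoder skeleton; the individual papers differ mainly in the loss term (for example, an MMD penalty in WAE-MMD or a weighted KL term in Beta-VAE).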

Requirements

  • Python >= 3.5
  • PyTorch >= 1.3
  • PyTorch Lightning >= 0.5.3 (GitHub Repo)

Installation

$ git clone https://github.com/AntixK/PyTorch-VAE
$ cd PyTorch-VAE
$ pip install -r requirements.txt

Usage

$ cd PyTorch-VAE
$ python run.py -c configs/<config-file-name.yaml>

Config file template

model_params:
  name: "<name of VAE model>"
  in_channels: 3
  latent_dim: 

exp_params:
  data_path: "<path to the celebA dataset>"
  img_size: 64    # Models are designed to work for this size
  batch_size: 64  # Better to have a square number
  LR: 0.005

trainer_params:
  gpus: 1         
  max_nb_epochs: 50

logging_params:
  save_dir: "logs/"
  name: "<experiment name>"
  manual_seed: 
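
For illustration, a runner along the lines of run.py would typically parse this YAML and wire the sections together roughly as sketched below. The vae_models registry and VAEXperiment wrapper imports are assumptions made for this sketch, not a guaranteed match to the repository's code, and the trainer keys are assumed to be accepted by the pinned PyTorch Lightning version; only the config keys come from the template above.

import argparse
import yaml
import torch
from pytorch_lightning import Trainer

# Assumed imports for this sketch: a name -> class registry of VAE models
# and a LightningModule that wraps a model with its training logic.
from models import vae_models          # assumption
from experiment import VAEXperiment    # assumption

parser = argparse.ArgumentParser(description="Generic runner for VAE models")
parser.add_argument("--config", "-c", dest="filename", default="configs/vae.yaml")
args = parser.parse_args()

with open(args.filename, "r") as f:
    config = yaml.safe_load(f)

# Reproducibility: seed manually if the config provides a seed.
seed = config["logging_params"].get("manual_seed")
if seed is not None:
    torch.manual_seed(seed)

# model_params.name selects the model; the remaining keys are its kwargs.
model_params = dict(config["model_params"])
model = vae_models[model_params.pop("name")](**model_params)

# exp_params (data path, image size, batch size, LR) drive the experiment wrapper.
experiment = VAEXperiment(model, config["exp_params"])

# trainer_params (gpus, max_nb_epochs) are forwarded to the Lightning Trainer.
runner = Trainer(**config["trainer_params"])
runner.fit(experiment)

With this structure, switching models only requires pointing -c at a different config file.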

Results

Model                    Paper   Reconstruction   Samples
VAE                      Link    (image)          (image)
WAE - MMD (RBF Kernel)   Link    (image)          (image)
WAE - MMD (IMQ Kernel)   Link    (image)          (image)
Beta-VAE                 Link    (image)          (image)
IWAE (5 Samples)         Link    (image)          (image)
DFCVAE                   Link    (image)          (image)
MSSIM VAE                Link    (image)          (image)

TODO

  • VanillaVAE
  • Conditional VAE
  • Gamma VAE
  • Beta VAE
  • Beta TC-VAE
  • DFC VAE
  • MSSIM VAE
  • InfoVAE (MMD-VAE)
  • WAE-MMD
  • AAE
  • TwoStageVAE
  • VAE-GAN
  • Vamp VAE
  • HVAE (VAE with Vamp Prior)
  • IWAE
  • VLAE
  • FactorVAE
  • PixelVAE
  • VQVAE
  • StyleVAE

Contributing

If you have trained a better model using these implementations by fine-tuning the hyperparameters in the config file, I would be happy to include your result (along with your config file) in this repo, citing your name 😊.
