PyTorch VAE

A collection of Variational Autoencoders (VAEs) implemented in PyTorch with a focus on reproducibility. The aim of this project is to provide quick and simple working examples for many of the cool VAE models out there.
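To give a flavor of what these models look like, here is a minimal sketch of a vanilla VAE with the reparameterization trick and an ELBO loss. This is an illustrative toy (the class, layer sizes, and names are hypothetical), not the repo's actual implementation:

```python
# Hedged sketch of a vanilla VAE -- illustrative only, not this repo's code.
import torch
from torch import nn

class TinyVAE(nn.Module):
    def __init__(self, in_dim=784, hidden=256, latent=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.fc_mu = nn.Linear(hidden, latent)
        self.fc_logvar = nn.Linear(hidden, latent)
        self.decoder = nn.Sequential(
            nn.Linear(latent, hidden), nn.ReLU(),
            nn.Linear(hidden, in_dim), nn.Sigmoid(),
        )

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps keeps sampling differentiable w.r.t. mu, sigma
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Negative ELBO = reconstruction term + KL(q(z|x) || N(0, I))
    recon_loss = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kld
```

Each model in this repo follows roughly this encode/sample/decode shape, differing mainly in the loss and prior.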

Requirements

  • Python >= 3.5
  • PyTorch >= 1.3
  • PyTorch Lightning >= 0.5.3 (GitHub Repo)

Installation

$ git clone https://github.com/AntixK/PyTorch-VAE
$ cd PyTorch-VAE
$ pip install -r requirements.txt

Usage

$ cd PyTorch-VAE
$ python run.py -c configs/<config-file-name.yaml>
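Each model is driven by a YAML file in `configs/`. The sketch below is a hypothetical example of what such a file might contain (the key names here are assumptions; consult the actual files in `configs/` for the real schema):

```yaml
# Hypothetical config sketch -- see the files in configs/ for the real keys.
model_params:
  name: "VanillaVAE"
  in_channels: 3
  latent_dim: 128

trainer_params:
  gpus: 1
  max_epochs: 50
```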

| Model                   | Paper | Reconstruction | Samples |
|-------------------------|-------|----------------|---------|
| VAE                     | Link  |                |         |
| WAE - MMD (RBF Kernel)  | Link  |                |         |
| WAE - MMD (IMQ Kernel)  | Link  |                |         |
| Beta-VAE                | Link  |                |         |
| IWAE (5 Samples)        | Link  |                |         |
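The two WAE-MMD rows differ only in the kernel used for the maximum-mean-discrepancy penalty. As a hedged NumPy sketch (function names are hypothetical, and the repo's estimator and hyper-parameters may differ), the biased MMD² with an RBF kernel can be computed as:

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise squared distances, then Gaussian (RBF) kernel values
    sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Biased MMD^2 estimate between sample sets x and y
    return (rbf_kernel(x, x, sigma).mean()
            + rbf_kernel(y, y, sigma).mean()
            - 2.0 * rbf_kernel(x, y, sigma).mean())
```

With identical sample sets the estimate is exactly zero; the IMQ variant simply swaps in an inverse-multiquadric kernel `c / (c + ||a - b||^2)` in place of the Gaussian.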

TODO

  • VanillaVAE
  • Conditional VAE
  • Gamma VAE
  • Beta VAE
  • DFC VAE
  • InfoVAE (MMD-VAE)
  • WAE-MMD
  • AAE
  • TwoStageVAE
  • VAE-GAN
  • Vamp VAE
  • HVAE (VAE with Vamp Prior)
  • IWAE
  • VLAE
  • FactorVAE
  • PixelVAE

Contributing

If you have trained a better model using these implementations by fine-tuning the hyper-parameters in the config file, I would be happy to include your result (along with your config file) in this repo, citing your name 😇.