PyTorch VAE

A Collection of Variational AutoEncoders (VAEs) implemented in PyTorch, with a focus on reproducibility. The aim of this project is to provide quick and simple working examples for many of the cool VAE models out there.

Requirements

  • Python >= 3.5
  • PyTorch >= 1.3
  • PyTorch Lightning >= 0.5.3

Installation

$ git clone https://github.com/AntixK/PyTorch-VAE
$ cd PyTorch-VAE
$ pip install -r requirements.txt

Usage

$ cd PyTorch-VAE
$ python run.py -c configs/<config-file-name.yaml>

| Model | Paper | Reconstruction | Samples |
|---|---|---|---|
| VAE | Link | | |
| WAE - MMD (RBF Kernel) | Link | | |
| WAE - MMD (IMQ Kernel) | Link | | |
| Beta-VAE | Link | | |
| IWAE (5 Samples) | Link | | |
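The vanilla VAE in the table rests on two ingredients: the reparameterization trick and a closed-form KL divergence between the diagonal-Gaussian posterior and the standard-normal prior. As a minimal sketch (in plain Python for illustration — the repo's own models are written in PyTorch, and these function names are not from the repo):

```python
import math
import random

def reparameterize(mu, log_var, rng=random):
    """Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, 1),
    so gradients can flow through mu and log_var."""
    sigma = math.exp(0.5 * log_var)
    eps = rng.gauss(0.0, 1.0)
    return mu + sigma * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ) for one latent dimension:
    0.5 * (sigma^2 + mu^2 - 1 - log sigma^2)."""
    return 0.5 * (math.exp(log_var) + mu * mu - 1.0 - log_var)

# A posterior that already matches the prior contributes zero KL.
assert kl_to_standard_normal(0.0, 0.0) == 0.0
```

The Beta-VAE row differs only in weighting this KL term by a factor beta > 1 in the loss; the IWAE row replaces the single sample with several (hence "5 Samples") and averages importance weights.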

TODO

  • VanillaVAE
  • Conditional VAE
  • Gamma VAE
  • Beta VAE
  • DFC VAE
  • InfoVAE (MMD-VAE)
  • WAE-MMD
  • AAE
  • TwoStageVAE
  • VAE-GAN
  • Vamp VAE
  • HVAE (VAE with Vamp Prior)
  • IWAE
  • VLAE
  • FactorVAE
  • PixelVAE
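Several entries above (WAE-MMD, InfoVAE) replace the KL term with a maximum mean discrepancy (MMD) between encoded samples and samples from the prior. A hedged sketch of a biased MMD estimate over scalar samples, with the two kernels named in the models table (RBF and IMQ) — plain Python for illustration, not the repo's implementation:

```python
import math

def rbf_kernel(x, y, bandwidth=1.0):
    """RBF kernel: k(x, y) = exp(-(x - y)^2 / (2 * bandwidth^2))."""
    return math.exp(-((x - y) ** 2) / (2.0 * bandwidth ** 2))

def imq_kernel(x, y, c=1.0):
    """Inverse multiquadric (IMQ) kernel: k(x, y) = c / (c + (x - y)^2).
    Its heavier tail keeps gradients alive for distant samples."""
    return c / (c + (x - y) ** 2)

def mmd(xs, ys, kernel):
    """Biased estimate of MMD^2 between two samples:
    mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)."""
    kxx = sum(kernel(a, b) for a in xs for b in xs) / (len(xs) ** 2)
    kyy = sum(kernel(a, b) for a in ys for b in ys) / (len(ys) ** 2)
    kxy = sum(kernel(a, b) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2.0 * kxy

# Identical samples give zero discrepancy; well-separated ones give a positive value.
assert abs(mmd([0.0, 1.0, 2.0], [0.0, 1.0, 2.0], rbf_kernel)) < 1e-12
assert mmd([0.0, 1.0, 2.0], [10.0, 11.0, 12.0], imq_kernel) > 0.0
```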

Contributing

If you have trained a better model using these implementations by fine-tuning the hyper-parameters in the config file, I would be happy to include your results (along with your config file) in this repo, with credit to you 😇.
