
AEVB Tutorial

Intro

PyTorch codebase for the paper [*Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial*](https://arxiv.org/abs/2208.07818).

```bibtex
@misc{zhihan2022aevb,
  url = {https://arxiv.org/abs/2208.07818},
  author = {Zhi-Han, Yang},
  title = {Training Latent Variable Models with Auto-encoding Variational Bayes: A Tutorial},
  publisher = {arXiv},
  year = {2022}
}
```

In the tutorial, we motivate the Auto-encoding Variational Bayes (AEVB) algorithm from the classic Expectation Maximization (EM) algorithm, and then derive from scratch the AEVB training procedure for the following models:

- Factor Analysis
- Variational Auto-Encoder (VAE)
- Conditional VAE
- Gaussian Mixture VAE
- Variational RNN

This repo contains minimal PyTorch implementations of these models. Pre-trained weights are included for every model except Factor Analysis (which takes less than 10 seconds to train), so it's easy to play around. All other models take less than 30 minutes to train from scratch. To run the notebooks, create a conda environment, install the required packages with `pip install -r requirements.txt`, and you should be ready to go.
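For orientation, here is a minimal, self-contained sketch of one AEVB training step for a Gaussian-latent VAE. It is illustrative only and does not reproduce the architectures, data, or hyperparameters used in this repo; the network sizes and the toy batch below are placeholders.

```python
# Minimal sketch of one AEVB training step (negative ELBO minimization)
# for a VAE with a Gaussian latent. NOT the implementation used in this repo.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=400):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.enc_mu = nn.Linear(h_dim, z_dim)       # mean of q(z|x)
        self.enc_logvar = nn.Linear(h_dim, z_dim)   # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        logits = self.dec(z)
        # Negative ELBO = reconstruction term + KL(q(z|x) || p(z)), averaged over the batch
        recon = F.binary_cross_entropy_with_logits(logits, x, reduction="sum") / x.size(0)
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
        return recon + kl

model = ToyVAE()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 784)   # stand-in batch; real data would come from a dataloader
loss = model(x)           # negative ELBO for this batch
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The reparameterization step is what makes the ELBO differentiable with respect to the encoder parameters, which is the core idea the tutorial builds on when deriving the training procedure for each model.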

Visualizations

(All plots below were created using the notebooks in this repo. You would very likely get better-quality generations by training longer; I didn't train the models to convergence in order to save time.)

Factor analysis

Variational Auto-Encoder

Conditional VAE

Gaussian Mixture VAE by Rui Shu (clusters ordered manually in the plot)

Variational RNN