A collection of generative algorithms and techniques implemented in Python.
This work is part of the Xecs TASTI project, nr. 2022005.
You will need:

- Python (see pyproject.toml for the full version)
- Git
- Make
- a .secrets file with the required secrets and credentials
- environment variables loaded from .env (see the sketch after this list)
- NVIDIA Drivers (mandatory) and CUDA >= 12.1 (mandatory if Docker is not used)
- a Weights & Biases account
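How the secrets are consumed is project-specific; as a rough sketch, once direnv (or your shell) has exported the variables from .env/.secrets, they can be read from the process environment in Python. The variable name below is only an example, not necessarily one the repository uses:

```python
import os

# Example key only: use whatever names your .secrets / .env files actually define.
wandb_key = os.environ.get("WANDB_API_KEY")
if wandb_key is None:
    raise RuntimeError("WANDB_API_KEY not set; check your .secrets and .env files.")
```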
Clone this repository (requires Git SSH keys):
git clone --recursive [email protected]:caetas/GenerativeZoo.git
cd generativezoo
Build the image using the provided Dockerfile and then run the container:
docker build --tag generativezoo .
docker create --gpus all --shm-size=1g -i --name generativezoo_container generativezoo
docker start generativezoo_container
To access the shell, please run:
docker exec -it generativezoo_container /bin/bash
Note: Edit the Dockerfile if you want to include data or model checkpoints in your image.
Create the conda environment (or activate it if it already exists):
conda env create -f environment.yml
conda activate python3.10
Then set up the virtualenvs using the Makefile recipe:
(python3.10) $ make setup-all
You might need to run the following command once to set up automatic activation of the conda environment and the virtualenv:
direnv allow
Feel free to edit the .envrc file if you prefer to activate the environments manually.
On Windows, you can set up the virtualenv by running the following commands:
python -m venv .venv-dev
.venv-dev/Scripts/Activate.ps1
python -m pip install --upgrade pip setuptools
pip install torch torchvision --index-url https://download.pytorch.org/whl/cu121
python -m pip install -r requirements/requirements-win.txt
To run the code, remember to always activate both environments:
conda activate python3.10
.venv-dev/Scripts/Activate.ps1
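As an optional sanity check (not part of the repository's scripts), you can confirm that PyTorch sees the GPU from inside the activated environments:

```python
import torch

# Print the installed PyTorch/CUDA versions and the visible GPU, if any.
print("torch:", torch.__version__, "| cuda build:", torch.version.cuda)
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("CUDA not available; training will fall back to CPU.")
```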
The following models are already implemented and fully integrated into the model zoo (a generic sketch of the simplest one, the Vanilla VAE, follows the list).
- Vanilla VAE: Paper | Code [1] | Script | Documentation
- Conditional VAE: Paper | Code [1] | Script | Documentation
- Hierarchical VAE: Paper | Code [2] | Script | Documentation
- Adversarial VAE: Paper | Code [1] | Script | Documentation
- Vanilla GAN: Paper | Code [3] | Script | Documentation
- Conditional GAN: Paper | Code [3] | Script | Documentation
- CycleGAN: Paper | Code [4] | Script | Documentation
- Prescribed GAN: Paper | Code [5] | Script | Documentation
- Wasserstein GAN with Gradient Penalty: Paper | Code [6] | Script | Documentation
- Vanilla DDPM: Paper | Code [7,8,9] | Script | Documentation
- Conditional DDPM: Paper | Code [10] | Script | Documentation
- Diffusion AE: Paper | Code [11] | Script | Documentation
- Vanilla SGM: Paper | Code [12] | Script | Documentation
- NCSNv2: Paper | Code [13] | Script | Documentation
- VQ-VAE + Transformer: Paper | Code [11] | Script | Documentation
- VQ-GAN + Transformer: Paper | Code [11] | Script | Documentation
- PixelCNN: Paper | Code [14] | Script | Documentation
- Vanilla Flow: Paper | Code [14] | Script | Documentation
- RealNVP: Paper | Code [15] | Script | Documentation
- Glow: Paper | Code [16] | Script | Documentation
- Flow++: Paper | Code [17] | Script | Documentation
- Flow Matching: Paper | Code [18,21] | Script | Documentation
- Conditional Flow Matching: Paper | Code [8,18] | Script | Documentation
- Rectified Flows: Paper | Code [19] | Script | Documentation
- Stable Diffusion + LoRA: Paper | Code [20] | Script | Documentation
- ControlNet: Paper | Code [20] | Script | Documentation
- InstructPix2Pix: Paper | Code [20] | Script | Documentation
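For orientation only, here is a generic PyTorch sketch of the first entry, a Vanilla VAE trained with the standard ELBO objective (reconstruction + KL). This is not the zoo's implementation and the layer sizes are arbitrary; see the linked code, scripts, and documentation for the actual models:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VanillaVAE(nn.Module):
    """Minimal fully-connected VAE for flattened 28x28 images."""

    def __init__(self, in_dim=784, hidden=400, latent=20):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)
        self.dec1 = nn.Linear(latent, hidden)
        self.dec2 = nn.Linear(hidden, in_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps with eps ~ N(0, I) (reparameterization trick).
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Negative ELBO: reconstruction term + KL(q(z|x) || N(0, I)).
    bce = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

# One training step on a random batch (stand-in for real data).
x = torch.rand(16, 784)
model = VanillaVAE()
recon, mu, logvar = model(x)
loss = vae_loss(recon, x, mu, logvar)
loss.backward()
```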
The following datasets are ready to be used to train and sample from the provided models. Unless marked otherwise, they are downloaded automatically the first time you use them (a minimal example of this pattern follows the list).
- MNIST: Source
- FashionMNIST: Source
- ChestMNIST++: Source
- OctMNIST++: Source
- PneumoniaMNIST++: Source
- TissueMNIST++: Source
- CIFAR-10: Source
- CIFAR-100: Source
- SVHN: Source
- Places365: Source
- DTD: Source
- TinyImageNet: Source | MANUAL DOWNLOAD REQUIRED (Link)
- Horse2Zebra: Source | MANUAL DOWNLOAD REQUIRED (Link)
- ImageNet-1k: Source
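The snippet below only illustrates the download-on-first-use behaviour with torchvision's MNIST loader; the zoo's own data-loading code may wrap this differently:

```python
import torch
from torchvision import datasets, transforms

# Downloads MNIST to ./data on first use; later runs reuse the local copy.
train_set = datasets.MNIST(
    root="./data", train=True, download=True, transform=transforms.ToTensor()
)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
images, labels = next(iter(loader))
print(images.shape)  # torch.Size([64, 1, 28, 28])
```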
The code examples are set up to use Weights & Biases to track your training runs. Please refer to the full documentation if required, or follow these steps:

- Create an account on Weights & Biases.
- If you have installed the requirements, you can skip this step. Otherwise, activate the conda environment and the virtualenv and run:
pip install wandb
- Run the following command and insert your API key when prompted:
wandb login

If you want to fully disable Weights & Biases during training, use the flag --no_wandb. If you want to turn off syncing with the server but keep the local copy of the logs, run:
wandb offline
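For reference, a minimal sketch of how a training loop typically logs to Weights & Biases; the project name and metrics below are placeholders, not the repository's actual configuration (the zoo's scripts handle this through their own CLI flags):

```python
import wandb

epochs = 3
# Hypothetical project name and config values.
run = wandb.init(project="GenerativeZoo", config={"lr": 1e-3, "epochs": epochs})
for epoch in range(epochs):
    dummy_loss = 1.0 / (epoch + 1)  # stand-in for a real training loss
    wandb.log({"epoch": epoch, "loss": dummy_loss})
run.finish()
```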
Full documentation is available here: docs/.
See the Developer guidelines for more information.
Contributions of any kind are welcome. Please read CONTRIBUTING.md for details and the process for submitting pull requests to us.
Please read MODELRULES.md for details on how you should build your models for this repository.
See the Changelog for more information.
Thank you for improving the security of the project, please see the Security Policy for more information.
This project is licensed under the terms of the CC-BY-4.0 license. See LICENSE for more details.
All the repositories used to develop this code are mentioned in each of the corresponding files and are referenced by number in Implemented Models:
1. PyTorch-VAE
2. nvae
3. conditional-GAN
4. PyTorch-GAN
5. PresGANs
6. wgan-gp
7. minDiffusion
8. DenoisingDiffusionProbabilisticModels
9. ddim
10. Conditional_Diffusion_MNIST
11. Generative Models
12. SGM Tutorial
13. score_sde_pytorch
14. uvadlc_notebooks
15. real-nvp
16. Glow-PyTorch
17. flowplusplus
18. conditional-flow-matching
19. minRF
20. diffusers
21. guided-diffusion
If you publish work that uses GenerativeZoo, please cite GenerativeZoo as follows:
@misc{GenerativeZoo,
author = {Francisco Caetano},
title = {A collection of generative algorithms and techniques implemented in Python.},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/caetas/GenerativeZoo}},
year = {2024},
}