Code for Invertible DenseNets.
This work is inspired by Invertible Residual Networks and Residual Flows. The code is adapted from the Residual Flows codebase.
A BibTeX entry for LaTeX users:
@misc{perugachidiaz2021invertible,
  title={Invertible DenseNets with Concatenated LipSwish},
  author={Yura Perugachi-Diaz and Jakub M. Tomczak and Sandjai Bhulai},
  year={2021},
  eprint={2102.02694},
  archivePrefix={arXiv}
}
- Python (tested with 3.7)
- PyTorch (tested with 1.5.0)
- CIFAR10 is automatically downloaded.
- Downloading and pre-processing ImageNet32 are described in the Residual Flows repository.
The default i-DenseNets depth and growth settings apply to both CIFAR10 and ImageNet32. Code for CIFAR10:
python train_img.py --data cifar10 --nblocks 16-16-16 --save experiments/cifar10 --densenet True --learnable_concat True --start_learnable_concat 25 --act CLipSwish --densenet_depth 3 --densenet_growth 172
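The --act CLipSwish flag selects the Concatenated LipSwish activation. As a rough sketch in pure Python (not the repository's implementation): LipSwish, taken from Residual Flows, is Swish divided by 1.1 so its Lipschitz constant is at most 1, and the concatenated variant applies it to both x and -x, doubling the channel dimension. The exact normalization constant that makes the concatenated map 1-Lipschitz is derived in the paper and omitted here.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lipswish(x, beta=1.0):
    # LipSwish from Residual Flows: Swish scaled by 1.1 so that its
    # Lipschitz constant is at most 1.
    return x * sigmoid(beta * x) / 1.1

def clipswish(xs, beta=1.0):
    # Concatenated variant: apply LipSwish to x and -x and concatenate,
    # doubling the number of channels. (The additional normalization the
    # paper uses to keep the concatenated map 1-Lipschitz is omitted.)
    return [lipswish(x, beta) for x in xs] + [lipswish(-x, beta) for x in xs]

print(clipswish([0.0, 1.0, -2.0]))
```

Concatenating x and -x lets the network use both halves of the activation without extra convolutions, which is why the growth sizes below can stay comparatively small.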
Code for the smaller architecture:
python train_img.py --data cifar10 --nblocks 4-4-4 --save experiments/cifar10_small --densenet True --learnable_concat True --start_learnable_concat 25 --act CLipSwish --densenet_depth ? --densenet_growth ?
where the two ? placeholders are replaced with one of the following depth and growth combinations, chosen so that the parameter count is similar to the smaller Residual Flow architecture:
--densenet_depth 2 --densenet_growth 318
--densenet_depth 3 --densenet_growth 178
--densenet_depth 4 --densenet_growth 122 (optimal architecture)
--densenet_depth 5 --densenet_growth 92
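These (depth, growth) pairs keep the parameter count roughly constant. A back-of-envelope sketch under illustrative assumptions (3x3 convolutions, a hypothetical 64-channel block input, a final 1x1 projection, and ignoring the channel doubling from the concatenated activation):

```python
def dense_block_params(depth, growth, in_ch=64, k=3):
    """Rough parameter count of one dense block: each of `depth` layers maps
    its concatenated input to `growth` new channels with a k x k convolution,
    and a final 1x1 convolution projects back to `in_ch` channels."""
    params = 0
    ch = in_ch
    for _ in range(depth):
        params += k * k * ch * growth  # k x k conv: ch -> growth channels
        ch += growth                   # concatenate the new features
    params += ch * in_ch               # 1x1 projection back to in_ch
    return params

configs = [(2, 318), (3, 178), (4, 122), (5, 92)]
counts = {cfg: dense_block_params(*cfg) for cfg in configs}
for cfg, n in counts.items():
    print(cfg, f"{n / 1e6:.2f}M")
```

Under this simplified model all four configurations land within roughly 25% of each other, which is the point of the table: depth and growth trade off against one another at a fixed budget.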
Code for ImageNet32:
python train_img.py --data imagenet32 --nblocks 32-32-32 --save experiments/imagenet32 --densenet True --learnable_concat True --start_learnable_concat 0 --act CLipSwish --densenet_depth 3 --densenet_growth 172
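Both training commands pass --learnable_concat True (with --start_learnable_concat delaying it on CIFAR10). A heavily hedged sketch of the underlying idea, not the repository's parametrization: if a block outputs concat(eta1 * x, eta2 * f(x)) with Lip(f) <= 1, the block's Lipschitz constant is bounded by sqrt(eta1^2 + eta2^2), so normalizing the learnable pair to unit norm keeps the bound at 1. All names here are hypothetical.

```python
import math

def normalized_concat_weights(eta1, eta2):
    # Hypothetical sketch: rescale the learnable pair (eta1, eta2) to unit
    # Euclidean norm, so that x -> (eta1 * x, eta2 * f(x)) has Lipschitz
    # constant at most 1 whenever Lip(f) <= 1.
    norm = math.sqrt(eta1 ** 2 + eta2 ** 2)
    return eta1 / norm, eta2 / norm

w1, w2 = normalized_concat_weights(2.0, 1.0)
print(w1, w2)
```

Setting eta1 = eta2 recovers a fixed 1/sqrt(2) scaling of both branches; learning the pair lets the network weight the identity and residual paths differently per block.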