LIIF

This repository contains the official implementation for LIIF introduced in the following paper:

Learning Continuous Image Representation with Local Implicit Image Function

Yinbo Chen, Sifei Liu, Xiaolong Wang

The project page with video is at https://yinboc.github.io/liif/.

Citation

If you find our work useful in your research, please cite:

@article{chen2020learning,
  title={Learning Continuous Image Representation with Local Implicit Image Function},
  author={Chen, Yinbo and Liu, Sifei and Wang, Xiaolong},
  journal={arXiv preprint arXiv:2012.09161},
  year={2020}
}

Environment

  • Python 3
  • PyTorch 1.6.0
  • TensorboardX
  • yaml, numpy, tqdm, imageio
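
These can be installed with pip, for example (the pinned torch version simply mirrors the list above; depending on your platform, PyTorch 1.6.0 may need to be installed from pytorch.org instead):

# one possible setup; package names are the usual PyPI names
pip install torch==1.6.0 tensorboardX pyyaml numpy tqdm imageio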

Quick Start

  1. Download a DIV2K pre-trained model.
Model                File size   Download
EDSR-baseline-LIIF   18M         Dropbox | Google Drive
RDN-LIIF             256M        Dropbox | Google Drive
  2. Use the model to represent your image as a LIIF and render it at a given resolution (with GPU 0; [MODEL_PATH] denotes the downloaded .pth file):
python demo.py --input xxx.png --model [MODEL_PATH] --resolution [HEIGHT],[WIDTH] --output output.png --gpu 0
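
For example, to upsample an image to 1024×1024 with the EDSR-baseline model (the file names here are illustrative placeholders):

# hypothetical paths; substitute your own image and the downloaded .pth file
python demo.py --input face_256.png --model edsr-baseline-liif.pth --resolution 1024,1024 --output face_1024.png --gpu 0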

Reproducing Experiments

Data

mkdir load to hold the dataset folders.

  • DIV2K: mkdir and cd into load/div2k. Download the HR images and the bicubic validation LR images from the DIV2K website (i.e. Train_HR, Valid_HR, Valid_LR_X2, Valid_LR_X3, Valid_LR_X4), then unzip these files to get the image folders (see the shell sketch after this list).

  • benchmark datasets: cd into load/. Download and tar -xf the benchmark datasets (provided by this repo) to get a load/benchmark folder with sub-folders Set5/, Set14/, B100/, Urban100/.

  • celebAHQ: mkdir load/celebAHQ and cp scripts/resize.py load/celebAHQ/, then cd load/celebAHQ/. Download and unzip data1024x1024.zip from the Google Drive link (provided by this repo). Run python resize.py to get the image folders 256/, 128/, 64/, 32/. Finally, download split.json.
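
A shell sketch of the DIV2K setup above (the zip names are the standard DIV2K file names and are an assumption; adjust if yours differ):

mkdir -p load/div2k && cd load/div2k
# download the Train_HR, Valid_HR and Valid_LR_X2/X3/X4 archives from the DIV2K website, then:
unzip DIV2K_train_HR.zip
unzip DIV2K_valid_HR.zip
unzip DIV2K_valid_LR_bicubic_X2.zip
unzip DIV2K_valid_LR_bicubic_X3.zip
unzip DIV2K_valid_LR_bicubic_X4.zip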

Running the code

0. Preliminaries

  • For train_liif.py or test.py, use --gpu [GPU] to specify the GPUs (e.g. --gpu 0 or --gpu 0,1).

  • For train_liif.py, by default, the save folder is at save/_[CONFIG_NAME]. We can use --name to specify a name if needed.

  • For dataset args in configs, cache: in_memory pre-loads the dataset into memory (may require large memory, e.g. ~40GB for DIV2K), cache: bin creates binary files (in a sibling folder) on the first run, and cache: none loads images directly from disk. Modify this according to your hardware resources before running the training scripts.
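
As an illustration, the cache mode is set per dataset in the training configs; the fragment below follows the repo's YAML config pattern, but the surrounding field names are recalled from the configs and should be treated as an assumption:

train_dataset:
  dataset:
    name: image-folder
    args:
      root_path: ./load/div2k/DIV2K_train_HR
      cache: in_memory  # or: bin / none, per the trade-offs above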

1. DIV2K experiments

Train: python train_liif.py --config configs/train-div2k/train_edsr-baseline-liif.yaml (with the EDSR-baseline backbone; for RDN, replace edsr-baseline with rdn). We use 1 GPU for training EDSR-baseline-LIIF and 4 GPUs for RDN-LIIF.
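
For example, an RDN-LIIF training run on 4 GPUs, deriving the config name from the substitution above and using the --gpu flag from the preliminaries:

python train_liif.py --config configs/train-div2k/train_rdn-liif.yaml --gpu 0,1,2,3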

Test: bash scripts/test-div2k.sh [MODEL_PATH] [GPU] for the DIV2K validation set, and bash scripts/test-benchmark.sh [MODEL_PATH] [GPU] for the benchmark datasets. [MODEL_PATH] is the path to a .pth file; we use epoch-last.pth in the corresponding save folder.
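
For example, with the default save folder naming save/_[CONFIG_NAME] from the preliminaries, testing the EDSR-baseline model on DIV2K could look like:

bash scripts/test-div2k.sh save/_train_edsr-baseline-liif/epoch-last.pth 0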

2. celebAHQ experiments

Train: python train_liif.py --config configs/train-celebAHQ/[CONFIG_NAME].yaml.

Test: python test.py --config configs/test/test-celebAHQ-32-256.yaml --model [MODEL_PATH] (or test-celebAHQ-64-128.yaml for the other task). We use epoch-best.pth in the corresponding save folder.
