PyTorch implementation of "Symmetric Parallax Attention for Stereo Image Super-Resolution", CVPRW 2021.
1. We develop a Siamese network equipped with a bi-directional PAM to super-resolve both left and right images.
2. We propose an inline occlusion handling scheme to deduce occlusions from parallax attention maps.
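For readers who want a concrete picture of the two contributions above, the snippet below is a minimal PyTorch sketch of bi-directional, row-wise (epipolar) attention between left and right features, plus an illustrative occlusion cue derived from the attention maps. It is not the iPASSR implementation: the projection layers, shapes, and the cycle-consistency occlusion rule are simplifying assumptions, so please refer to the code in this repository for the actual modules.

```python
import torch
import torch.nn as nn

class ParallaxAttentionSketch(nn.Module):
    """Minimal sketch of one bi-directional parallax attention step.

    Attention runs along the horizontal (epipolar) dimension, so each pixel in
    the left feature map attends to all pixels in the same row of the right
    feature map, and vice versa. Layer names and sizes are illustrative only.
    """

    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels, kernel_size=1)
        self.key = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feat_left, feat_right):
        b, c, h, w = feat_left.shape
        # Project features and reshape to (b*h, w, c) so attention is per row.
        q = self.query(feat_left).permute(0, 2, 3, 1).reshape(b * h, w, c)
        k = self.key(feat_right).permute(0, 2, 3, 1).reshape(b * h, w, c)
        # Right-to-left and left-to-right parallax attention maps: (b*h, w, w).
        att_r2l = torch.softmax(torch.bmm(q, k.transpose(1, 2)), dim=-1)
        att_l2r = torch.softmax(torch.bmm(k, q.transpose(1, 2)), dim=-1)
        # Warp each view's features to the other view via the attention maps.
        v_right = feat_right.permute(0, 2, 3, 1).reshape(b * h, w, c)
        v_left = feat_left.permute(0, 2, 3, 1).reshape(b * h, w, c)
        warped_to_left = torch.bmm(att_r2l, v_right).reshape(b, h, w, c).permute(0, 3, 1, 2)
        warped_to_right = torch.bmm(att_l2r, v_left).reshape(b, h, w, c).permute(0, 3, 1, 2)
        # Illustrative occlusion cue (not the paper's exact rule): pixels whose
        # round-trip attention deviates from identity are likely occluded.
        cycle = torch.bmm(att_r2l, att_l2r)                       # (b*h, w, w)
        identity = torch.eye(w, device=cycle.device).expand_as(cycle)
        occ_score = (cycle - identity).abs().sum(dim=-1).reshape(b, h, w)
        return warped_to_left, warped_to_right, occ_score
```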
We share the quantitative and qualitative results achieved by our iPASSR on all the test sets for both 2xSR and 4xSR, so that researchers can compare their algorithms to our method without performing inference. Results are available on Google Drive and Baidu Drive (Key: NUDT).
- PyTorch 1.3.0, torchvision 0.4.1. The code is tested with python=3.7, cuda=9.0.
- Matlab (For training/test data generation and performance evaluation)
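A quick way to check that the Python side of the environment matches the versions above (newer versions may also work, but are untested here):

```python
# Sanity check: print the interpreter, PyTorch, and torchvision versions and
# whether CUDA is visible. Versions other than those listed above may still
# work, but have not been tested.
import sys
import torch
import torchvision

print("python:", sys.version.split()[0])
print("torch:", torch.__version__, "| torchvision:", torchvision.__version__)
print("cuda available:", torch.cuda.is_available())
```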
- Download the training sets from Baidu Drive (Key: NUDT) and unzip them to `./data/train/`.
- Run `./data/train/GenerateTrainingPatches.m` to generate training patches (a Python sketch of this step follows this list).
- Run `train.py` to perform training. Checkpoints will be saved to `./log/`.
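If MATLAB is unavailable, the patch-generation step can be approximated in Python. The sketch below only assumes that `GenerateTrainingPatches.m` crops aligned stereo patches and builds LR counterparts by bicubic downsampling; the patch size, stride, and on-disk layout expected by `train.py` are placeholders, so treat the MATLAB script as the reference.

```python
# Hypothetical Python stand-in for GenerateTrainingPatches.m (an assumption,
# not the official pipeline): crop aligned HR patches from a stereo pair and
# build bicubically downsampled LR counterparts.
from PIL import Image

def generate_patches(hr_left, hr_right, scale=4, lr_patch=32, lr_stride=20):
    """Yield (hr_left, hr_right, lr_left, lr_right) PIL patch tuples."""
    w, h = hr_left.size
    w, h = w - w % scale, h - h % scale            # make dims divisible by scale
    hr_left, hr_right = hr_left.crop((0, 0, w, h)), hr_right.crop((0, 0, w, h))
    lr_left = hr_left.resize((w // scale, h // scale), Image.BICUBIC)
    lr_right = hr_right.resize((w // scale, h // scale), Image.BICUBIC)
    # Slide a window over the LR grid so HR/LR patches stay exactly aligned.
    for y in range(0, h // scale - lr_patch + 1, lr_stride):
        for x in range(0, w // scale - lr_patch + 1, lr_stride):
            lr_box = (x, y, x + lr_patch, y + lr_patch)
            hr_box = tuple(v * scale for v in lr_box)
            yield (hr_left.crop(hr_box), hr_right.crop(hr_box),
                   lr_left.crop(lr_box), lr_right.crop(lr_box))

if __name__ == "__main__":
    # Example usage with placeholder file names.
    left = Image.open("hr0.png").convert("RGB")    # left HR view (placeholder)
    right = Image.open("hr1.png").convert("RGB")   # right HR view (placeholder)
    for patch_tuple in generate_patches(left, right):
        pass  # save the patches in whatever layout train.py expects
```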
- Download the test sets and unzip them to `./data/`. Here, we provide the full test sets used in our paper on Google Drive and Baidu Drive (Key: NUDT).
- Run `test.py` to perform a demo inference. Results (`.png` files) will be saved to `./results/`.
- Run `evaluation.m` to calculate PSNR and SSIM scores (a Python sketch of the metric computation follows this list).
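`evaluation.m` defines the evaluation protocol used in the paper. As a quick sanity check outside MATLAB, PSNR and SSIM can also be computed with scikit-image (version >= 0.19 assumed for `channel_axis`); the file names below are placeholders, and the numbers may differ slightly from the MATLAB script (e.g. due to boundary cropping or color-space choices).

```python
# Hedged sanity check, not a replacement for evaluation.m: compare one
# super-resolved image against its ground truth. Paths are placeholders.
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

sr = np.array(Image.open("./results/example_L.png").convert("RGB"))       # placeholder
gt = np.array(Image.open("./data/test/example_hr_L.png").convert("RGB"))  # placeholder

psnr = peak_signal_noise_ratio(gt, sr, data_range=255)
ssim = structural_similarity(gt, sr, channel_axis=-1, data_range=255)
print(f"PSNR: {psnr:.2f} dB | SSIM: {ssim:.4f}")
```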
- The 2x/4x models of EDSR/RDN/RCAN retrained on stereo image datasets. Google Drive, Baidu Drive (Key: NUDT).
- The 2x/4x results of StereoSR. Google Drive, Baidu Drive (Key: NUDT).
- The 4x results of SRRes+SAM. Google Drive, Baidu Drive (Key: NUDT).
We hope this work can facilitate future research in stereo image SR. If you find this work helpful, please consider citing:
@InProceedings{iPASSR,
author = {Wang, Yingqian and Ying, Xinyi and Wang, Longguang and Yang, Jungang and An, Wei and Guo, Yulan},
title = {Symmetric Parallax Attention for Stereo Image Super-Resolution},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2021},
pages = {766-775}
}
Any questions regarding this work can be addressed to [email protected].