VQSA

Paper: Vector Quantization with Self-Attention for Quality-Independent Representation Learning (CVPR 2023). This is the official PyTorch implementation of the paper.

Please note that we have recently found the performance to be quite satisfactory with a codebook size of N = 1000. Given the low parameter count and computational complexity at this setting, we recommend using N = 1000. We also provide pre-trained checkpoints for reproduction on Google Drive. Download the ckpt file and place it in checkpoints/SA.
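
For orientation, below is a minimal sketch of a vector-quantization layer with a codebook of size N = 1000. It only illustrates the generic nearest-codeword lookup with a straight-through gradient; the class name, feature dimension, and all other details are illustrative assumptions and do not reflect the actual modules in this repository or the self-attention mechanism described in the paper.

import torch
import torch.nn as nn


class SimpleVectorQuantizer(nn.Module):
    """Maps each feature vector to its nearest codeword in a learned codebook."""

    def __init__(self, num_codewords: int = 1000, dim: int = 256):
        super().__init__()
        self.codebook = nn.Embedding(num_codewords, dim)
        nn.init.uniform_(self.codebook.weight, -1.0 / num_codewords, 1.0 / num_codewords)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, num_tokens, dim)
        flat = features.reshape(-1, features.shape[-1])      # (B*T, dim)
        # Euclidean distance from each feature vector to every codeword.
        dists = torch.cdist(flat, self.codebook.weight)      # (B*T, N)
        indices = dists.argmin(dim=-1)                       # nearest codeword ids
        quantized = self.codebook(indices).view_as(features)
        # Straight-through estimator so gradients flow back to the encoder.
        return features + (quantized - features).detach()


if __name__ == "__main__":
    vq = SimpleVectorQuantizer(num_codewords=1000, dim=256)
    x = torch.randn(2, 49, 256)
    print(vq(x).shape)  # torch.Size([2, 49, 256])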

Usage

To install the imagecorruptions package, run the command below:

pip install imagecorruptions
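
As a quick check that the package is installed, the snippet below applies one corruption to a dummy image. The corrupt() and get_corruption_names() calls are from the imagecorruptions package's public API; the dummy image and chosen corruption type are only placeholders, not settings used by this repository.

import numpy as np
from imagecorruptions import corrupt, get_corruption_names

# Dummy uint8 RGB image; replace with a real image loaded as a numpy array (H, W, 3).
image = np.random.randint(0, 255, size=(224, 224, 3), dtype=np.uint8)

# Apply one corruption type at a chosen severity (1 = mild, 5 = strong).
corrupted = corrupt(image, corruption_name='gaussian_noise', severity=3)

# List all available corruption types.
print(get_corruption_names())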
