DBTC

Exploiting Multi-scale Parallel Self-attention and Local Variation via Dual-branch Transformer-CNN Structure for Face Super-resolution

This is the original PyTorch implementation of "Exploiting Multi-scale Parallel Self-attention and Local Variation via Dual-branch Transformer-CNN Structure for Face Super-resolution". Code is coming soon.

(Figures: overall structure, results.)

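Since the official code has not been released, the snippet below is only a minimal, hypothetical sketch of the idea named in the title: a dual-branch block in which a Transformer branch applies self-attention in parallel at several scales while a CNN branch captures local variation, and the two outputs are fused. All module names, channel sizes, and design details here are assumptions for illustration, not the DBTC implementation.

```python
# Illustrative sketch only (not the official DBTC code): one dual-branch
# Transformer-CNN block with multi-scale parallel self-attention.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleParallelSelfAttention(nn.Module):
    """Runs self-attention in parallel over feature maps pooled to several scales."""

    def __init__(self, channels, num_heads=4, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.attn = nn.ModuleList(
            [nn.MultiheadAttention(channels, num_heads, batch_first=True) for _ in scales]
        )
        self.proj = nn.Conv2d(channels * len(scales), channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        outs = []
        for scale, attn in zip(self.scales, self.attn):
            # Downsample to the current scale and attend over the flattened tokens.
            xs = F.adaptive_avg_pool2d(x, (h // scale, w // scale))
            hs, ws = xs.shape[-2:]
            tokens = xs.flatten(2).transpose(1, 2)          # (B, hs*ws, C)
            attended, _ = attn(tokens, tokens, tokens)
            attended = attended.transpose(1, 2).reshape(b, c, hs, ws)
            # Upsample back so all scales can be concatenated and fused.
            outs.append(F.interpolate(attended, size=(h, w), mode="bilinear", align_corners=False))
        return self.proj(torch.cat(outs, dim=1))


class DualBranchBlock(nn.Module):
    """Transformer branch (global, multi-scale attention) + CNN branch (local variation)."""

    def __init__(self, channels):
        super().__init__()
        self.transformer_branch = MultiScaleParallelSelfAttention(channels)
        self.cnn_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.fuse = nn.Conv2d(channels * 2, channels, kernel_size=1)

    def forward(self, x):
        global_feat = self.transformer_branch(x)
        local_feat = self.cnn_branch(x)
        # Residual fusion of global (Transformer) and local (CNN) features.
        return x + self.fuse(torch.cat([global_feat, local_feat], dim=1))


if __name__ == "__main__":
    block = DualBranchBlock(channels=64)
    lr_face_features = torch.randn(1, 64, 32, 32)  # e.g. features of a 32x32 low-res face
    print(block(lr_face_features).shape)           # torch.Size([1, 64, 32, 32])
```

In a full face super-resolution network such a block would typically be stacked several times over shallow features before an upsampling head; the official release should be consulted for the actual design.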