DBTC

Exploiting Multi-scale Parallel Self-attention and Local Variation via Dual-branch Transformer-CNN Structure for Face Super-resolution

This is the original PyTorch implementation of "Exploiting Multi-scale Parallel Self-attention and Local Variation via Dual-branch Transformer-CNN Structure for Face Super-resolution". Code is coming soon.

[Figures: overall structure, results]
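Until the official code is released, the snippet below is a minimal, hypothetical sketch of what a dual-branch Transformer-CNN block could look like in PyTorch: one branch uses multi-head self-attention for global dependencies, the other uses convolutions for local variation, and the two are fused by a 1x1 convolution. All module names and hyperparameters here are illustrative assumptions and are not taken from the DBTC paper or its forthcoming code.

```python
# Hypothetical dual-branch Transformer-CNN block (not the authors' implementation).
import torch
import torch.nn as nn


class DualBranchBlock(nn.Module):
    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        # Transformer branch: global dependencies via multi-head self-attention.
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        # CNN branch: local variation via depthwise + pointwise convolutions.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),
            nn.GELU(),
            nn.Conv2d(channels, channels, 1),
        )
        # Fuse the two branches back into a single feature map.
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Transformer branch on flattened (H*W) tokens.
        tokens = x.flatten(2).transpose(1, 2)          # (B, H*W, C)
        t = self.norm(tokens)
        t, _ = self.attn(t, t, t)
        t = (tokens + t).transpose(1, 2).reshape(b, c, h, w)
        # CNN branch with a residual connection.
        l = x + self.local(x)
        # Concatenate branch outputs and fuse.
        return self.fuse(torch.cat([t, l], dim=1))


if __name__ == "__main__":
    feats = torch.randn(1, 64, 32, 32)     # toy feature map
    out = DualBranchBlock()(feats)
    print(out.shape)                        # torch.Size([1, 64, 32, 32])
```

In a full face super-resolution network, blocks like this would typically sit between a shallow feature extractor and an upsampling/reconstruction head; refer to the paper and the released code for the actual architecture.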