## TransGanFormer (wip)

Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GansFormer and TransGAN papers. It will also contain a number of tricks I have picked up while building transformers and GANs over the last year or so, including efficient linear attention and pixel-level attention.
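As a rough illustration of the efficient linear attention mentioned above, here is a minimal sketch in the style of Katharopoulos et al. (2020), using the positive feature map `elu(x) + 1`. The module and parameter names are hypothetical and are not this repository's API.

```python
import torch
import torch.nn.functional as F
from torch import nn, einsum

class LinearAttention(nn.Module):
    # illustrative sketch, not the module used in this repo
    def __init__(self, dim, heads = 8, dim_head = 64):
        super().__init__()
        inner_dim = heads * dim_head
        self.heads = heads
        self.to_qkv = nn.Linear(dim, inner_dim * 3, bias = False)
        self.to_out = nn.Linear(inner_dim, dim)

    def forward(self, x):
        b, n, _, h = *x.shape, self.heads
        q, k, v = self.to_qkv(x).chunk(3, dim = -1)

        # split heads: (b, n, h * d) -> (b, h, n, d)
        q, k, v = map(lambda t: t.reshape(b, n, h, -1).transpose(1, 2), (q, k, v))

        # positive kernel feature map makes the linearized attention valid
        q, k = map(lambda t: F.elu(t) + 1, (q, k))

        # contract keys with values first: O(n * d^2) rather than O(n^2 * d)
        kv = einsum('b h n d, b h n e -> b h d e', k, v)

        # per-query normalizer, replacing the softmax denominator
        z = 1 / (einsum('b h n d, b h d -> b h n', q, k.sum(dim = 2)) + 1e-8)

        out = einsum('b h n d, b h d e, b h n -> b h n e', q, kv, z)
        out = out.transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)

# usage: a 32x32 feature map flattened to 1024 tokens
attn = LinearAttention(dim = 256)
x = torch.randn(1, 1024, 256)
out = attn(x)  # (1, 1024, 256)
```

The appeal for GANs is that the cost grows linearly with the number of tokens, which matters once every pixel of a feature map is treated as a token, as in the pixel-level attention mentioned above.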

## Citations

```bibtex
@misc{jiang2021transgan,
    title   = {TransGAN: Two Transformers Can Make One Strong GAN},
    author  = {Yifan Jiang and Shiyu Chang and Zhangyang Wang},
    year    = {2021},
    eprint  = {2102.07074},
    archivePrefix = {arXiv},
    primaryClass = {cs.CV}
}
```

```bibtex
@misc{hudson2021generative,
    title   = {Generative Adversarial Transformers},
    author  = {Drew A. Hudson and C. Lawrence Zitnick},
    year    = {2021},
    eprint  = {2103.01209},
    archivePrefix = {arXiv},
    primaryClass = {cs.CV}
}
```
