zalandoresearch/gpa
Grid Partitioned Attention: an efficient attention approximation with inductive bias for the image domain.

Code for the paper "Grid Partitioned Attention: Efficient Transformer Approximation with Inductive Bias for High Resolution Detail Generation", by Nikolay Jetchev, Gökhan Yildirim, Christian Bracher, Roland Vollgraf.

The file GPAmodule.py contains the GPA layer definition and an example of how to apply it to an image tensor.
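To illustrate the core idea behind grid-partitioned attention, here is a minimal, self-contained sketch (not the GPAmodule.py implementation, whose actual API may differ): the feature map is split into a coarse spatial grid, and plain self-attention is computed independently within each cell, which reduces the quadratic cost of full attention over all pixels to attention over small cells. The function name `grid_partitioned_attention` and the `grid` parameter are illustrative, not taken from the repository.

```python
# Sketch of grid-partitioned attention: split a (B, C, H, W) feature map
# into grid x grid non-overlapping cells and run scaled dot-product
# self-attention inside each cell independently.
# NOT the authors' GPAmodule.py code; a simplified illustration only.
import torch


def grid_partitioned_attention(x, grid=4):
    """x: (B, C, H, W) feature map; grid: number of cells per spatial axis.
    H and W must be divisible by `grid` in this simplified sketch."""
    B, C, H, W = x.shape
    gh, gw = H // grid, W // grid
    # Partition into grid*grid cells of size gh x gw, flattened to tokens.
    cells = x.view(B, C, grid, gh, grid, gw)
    cells = cells.permute(0, 2, 4, 3, 5, 1).reshape(B * grid * grid, gh * gw, C)
    # Scaled dot-product self-attention within each cell
    # (queries = keys = values = the cell's own features).
    attn = torch.softmax(cells @ cells.transpose(1, 2) / C ** 0.5, dim=-1)
    out = attn @ cells
    # Undo the partitioning back to (B, C, H, W).
    out = out.reshape(B, grid, grid, gh, gw, C).permute(0, 5, 1, 3, 2, 4)
    return out.reshape(B, C, H, W)


x = torch.randn(2, 8, 16, 16)
y = grid_partitioned_attention(x, grid=4)
print(y.shape)  # torch.Size([2, 8, 16, 16])
```

Full attention over an `H x W` map costs `O((HW)^2)`; restricting attention to `grid^2` cells of `(H/grid) x (W/grid)` pixels reduces this by a factor of `grid^2`, at the price of no cross-cell interaction in this simplified form.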

TODO: add the full generator architecture for pose morphing with attention copying, as described in the paper.
