axial-attention
Implementation of Axial attention - attending to multi-dimensional data efficiently
```python
import torch
from axial_attention import AxialAttention

img = torch.randn(1, 3, 256, 256)

attn = AxialAttention(
    dim = 3,           # embedding dimension
    dim_index = 1,     # where is the embedding dimension
    heads = 1,         # number of heads for multi-head attention
    num_dimensions = 2 # number of axial dimensions (images have 2)
)

attn(img) # (1, 3, 256, 256)
```
Any examples of sampling / training?
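There don't appear to be training examples in the repo itself; below is a minimal, illustrative sketch of using AxialAttention as a layer inside a tiny model with a dummy denoising-style loss. The model, the loss, and the shapes are all assumptions for demonstration, not code from the library.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from axial_attention import AxialAttention

class ToyModel(nn.Module):
    """Axial attention over 2D feature maps, plus a 1x1 conv head (illustrative only)."""
    def __init__(self, channels = 32):
        super().__init__()
        self.attn = AxialAttention(dim = channels, dim_index = 1, heads = 4, num_dimensions = 2)
        self.head = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        return self.head(x + self.attn(x))  # residual connection around the attention

model = ToyModel()
opt = torch.optim.Adam(model.parameters(), lr = 1e-3)

for step in range(10):
    clean = torch.randn(2, 32, 64, 64)             # random dummy data, just to show the loop
    noisy = clean + 0.1 * torch.randn_like(clean)
    loss = F.mse_loss(model(noisy), clean)
    opt.zero_grad()
    loss.backward()
    opt.step()
```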
Hi there! Excellent project! I'm using axial-attention with video (1, 5, 128, 256, 256) and `sum_axial_out=True`, and I wish to visualise the attention maps. Essentially, given my video, and two...
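For reference, a setup matching that video shape follows; it is a sketch assuming the layout (batch, frames, channels, height, width), so the embedding dimension sits at index 2. As far as I know, the public API only returns the attended output, so recovering the per-axis attention maps would likely require a forward hook on the internal attention modules or a small fork.

```python
import torch
from axial_attention import AxialAttention

video = torch.randn(1, 5, 128, 256, 256)  # (batch, frames, channels, height, width)

attn = AxialAttention(
    dim = 128,            # embedding (channel) dimension
    dim_index = 2,        # channels sit at index 2 in this layout
    heads = 8,
    num_dimensions = 3,   # attend along frames, height, and width in turn
    sum_axial_out = True  # sum the three per-axis attention outputs
)

out = attn(video)  # (1, 5, 128, 256, 256)
```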
Thanks for sharing. This is great work! As TensorFlow is another major framework widely used in production environments, is there a TensorFlow version of this work?
Hi, thanks for your effort and great work. I am working on 3D images and want to apply axial attention to a transformer. I am wondering if I can...
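For a 3D volume, one plausible approach (an assumption on my part, not an official recipe) is to stack axial attention with a pointwise feed-forward in a transformer-style residual block. The `AxialBlock3d` class below is hypothetical; it assumes channel-first volumes of shape (batch, channels, depth, height, width) and omits normalisation for brevity.

```python
import torch
import torch.nn as nn
from axial_attention import AxialAttention

class AxialBlock3d(nn.Module):
    """Hypothetical transformer-style block for volumes (batch, channels, D, H, W)."""
    def __init__(self, dim = 64, heads = 8):
        super().__init__()
        self.attn = AxialAttention(dim = dim, dim_index = 1, heads = heads, num_dimensions = 3)
        self.ff = nn.Sequential(
            nn.Conv3d(dim, dim * 4, 1),
            nn.GELU(),
            nn.Conv3d(dim * 4, dim, 1),
        )

    def forward(self, x):
        x = x + self.attn(x)  # axial attention with a residual connection
        x = x + self.ff(x)    # pointwise feed-forward with a residual connection
        return x

vol = torch.randn(1, 64, 16, 32, 32)
out = AxialBlock3d()(vol)  # (1, 64, 16, 32, 32)
```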