long-context-attention topic

Repositories tagged long-context-attention:
flash-attention-jax (182 stars, 23 forks)
  Implementation of Flash Attention in JAX
block-recurrent-transformer-pytorch (212 stars, 19 forks)
  Implementation of Block Recurrent Transformer in PyTorch
RAN (20 stars, 3 forks)
  RAN: Recurrent Attention Networks for Long-text Modeling (Findings of ACL 2023)
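The repositories above center on attention variants for long contexts. As a minimal sketch of the core idea behind Flash Attention (which flash-attention-jax ports to JAX): process keys and values in blocks with an online softmax, so the full attention score matrix is never materialized. The NumPy code below is illustrative only, with hypothetical function names; it is not taken from any of the listed repositories.

```python
import numpy as np

def naive_attention(q, k, v):
    # Standard attention: softmax(Q K^T / sqrt(d)) V,
    # materializing the full (n_q, n_k) score matrix.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def blockwise_attention(q, k, v, block=4):
    # Flash-attention-style streaming (illustrative sketch): visit key/value
    # blocks one at a time, keeping a running row max and row sum so the
    # softmax is computed online and memory stays O(block) per query row.
    d = q.shape[-1]
    n_q = q.shape[0]
    out = np.zeros_like(q, dtype=np.float64)
    row_max = np.full(n_q, -np.inf)   # running max of scores per query row
    row_sum = np.zeros(n_q)           # running softmax denominator per row
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]
        vb = v[start:start + block]
        s = q @ kb.T / np.sqrt(d)                 # scores for this block only
        new_max = np.maximum(row_max, s.max(axis=-1))
        correction = np.exp(row_max - new_max)    # rescale old accumulators
        p = np.exp(s - new_max[:, None])
        out = out * correction[:, None] + p @ vb
        row_sum = row_sum * correction + p.sum(axis=-1)
        row_max = new_max
    return out / row_sum[:, None]
```

Because the rescaling by `correction` keeps the running accumulators consistent with the new maximum, the blockwise result matches the naive computation exactly (up to floating-point error), which is what makes the streaming formulation usable for long sequences.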