hard-attention topic

List of hard-attention repositories

MoChA-pytorch (75 stars, 19 forks)

PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018); a minimal sketch of the mechanism appears after this list.

Attention (41 stars, 7 forks)

Repository for Attention Algorithm
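
The MoChA-pytorch entry above refers to monotonic chunkwise attention (Chiu & Raffel, ICLR 2018). As a rough illustration of the idea, and not the repository's actual code, here is a minimal PyTorch sketch of hard monotonic attention at decoding time: the decoder scans encoder states left to right from where the previous step stopped, attends at the first position whose selection probability crosses 0.5, and then applies soft attention over a fixed-size chunk ending there. The class name, `chunk_size`, and the energy networks are illustrative assumptions.

```python
# Minimal sketch of hard monotonic (chunkwise) attention for greedy decoding.
# Assumed names and layer shapes are illustrative, not the MoChA-pytorch API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HardMonotonicAttention(nn.Module):
    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int, chunk_size: int = 4):
        super().__init__()
        self.chunk_size = chunk_size
        # Energy network for the hard "attend here or keep moving" decision.
        self.monotonic_energy = nn.Sequential(
            nn.Linear(enc_dim + dec_dim, attn_dim), nn.Tanh(), nn.Linear(attn_dim, 1)
        )
        # Energy network for soft attention inside the chunk.
        self.chunk_energy = nn.Sequential(
            nn.Linear(enc_dim + dec_dim, attn_dim), nn.Tanh(), nn.Linear(attn_dim, 1)
        )

    @torch.no_grad()
    def forward(self, enc: torch.Tensor, dec_state: torch.Tensor, start: int = 0):
        """One greedy decoding step for a single (unbatched) sequence.

        enc:       (T, enc_dim) encoder states
        dec_state: (dec_dim,)   current decoder state
        start:     index where the previous step stopped (enforces monotonicity)
        Returns (context, stop_index).
        """
        T = enc.size(0)
        stop = T - 1
        # Scan left to right; attend at the first position whose
        # selection probability reaches 0.5 (the hard monotonic decision).
        for t in range(start, T):
            e = self.monotonic_energy(torch.cat([enc[t], dec_state]))
            if torch.sigmoid(e).item() >= 0.5:
                stop = t
                break
        # Chunkwise part: soft attention over a fixed-size window ending at `stop`.
        lo = max(0, stop - self.chunk_size + 1)
        chunk = enc[lo:stop + 1]                                    # (w, enc_dim)
        u = self.chunk_energy(
            torch.cat([chunk, dec_state.expand(chunk.size(0), -1)], dim=-1)
        ).squeeze(-1)                                               # (w,)
        weights = F.softmax(u, dim=0)
        context = weights @ chunk                                   # (enc_dim,)
        return context, stop
```

At training time, implementations such as the paper's replace the hard scan with a differentiable expected-attention computation; the sketch above only shows the inference-time behavior that gives "hard" attention its name.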