transformer-attention topic

Repositories matching the transformer-attention topic

multi-head_self-attention

70 Stars · 14 Forks
A Faster PyTorch Implementation of Multi-Head Self-Attention
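
For context, the sketch below shows how multi-head self-attention is typically implemented in PyTorch. It is not the code from the repository above; the class name `MultiHeadSelfAttention`, the fused QKV projection, and the hyperparameters are illustrative assumptions.

```python
# A minimal sketch of multi-head self-attention, assuming a fused QKV
# projection (a common speed trick: one matmul instead of three).
# Not the repository's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv_proj = nn.Linear(embed_dim, 3 * embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        qkv = self.qkv_proj(x)  # (batch, seq_len, 3 * embed_dim)
        q, k, v = qkv.chunk(3, dim=-1)
        # Reshape each to (batch, num_heads, seq_len, head_dim).
        q = q.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = F.softmax(scores, dim=-1)
        out = weights @ v  # (batch, num_heads, seq_len, head_dim)
        # Merge the heads back into the embedding dimension.
        out = out.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(out)


if __name__ == "__main__":
    attn = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
    x = torch.randn(2, 10, 64)  # (batch, seq_len, embed_dim)
    print(attn(x).shape)  # torch.Size([2, 10, 64])
```

A "faster" implementation along these lines usually comes from fusing the query, key, and value projections into one linear layer and batching all heads into a single matmul, as done here, rather than looping over heads.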