multi-head-attention topic

A list of repositories under the multi-head-attention topic.

SentEncoding (16 stars, 6 forks)

Sentence encoder and training code for Mean-Max AAE

Att-Induction (45 stars, 7 forks)

Attention-based Induction Networks for Few-Shot Text Classification

datagrand_bert (20 stars, 5 forks)

Code for 5th place in the 2019 Datagrand Cup information extraction competition

Multi2OIE (57 stars, 19 forks)

Multi^2OIE: Multilingual Open Information Extraction Based on Multi-Head Attention with BERT (Findings of ACL: EMNLP 2020)

attention (39 stars, 10 forks)

Several types of attention modules written in PyTorch
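
None of the repository's own code is shown on this page; as a reference point for the topic, a minimal PyTorch multi-head attention module of the kind such a collection typically contains might look like the sketch below. All names and sizes are illustrative, not taken from the repo.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    # Project to Q/K/V, split into heads, apply scaled dot-product
    # attention per head, then merge heads and project back out.
    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)  # fused Q, K, V projections
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # (batch, seq, d_model) -> (batch, heads, seq, d_head)
        q, k, v = (z.view(b, t, self.num_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, d)  # merge heads
        return self.out(out)

x = torch.randn(2, 16, 64)                 # (batch, seq_len, d_model)
print(MultiHeadAttention(64, 8)(x).shape)  # torch.Size([2, 16, 64])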

scDINO (37 stars, 8 forks)

Self-Supervised Vision Transformers for multiplexed imaging datasets

flash_attention_inference (20 stars, 2 forks)

Benchmarks the performance of the C++ interfaces of FlashAttention and FlashAttention-2 in large language model (LLM) inference scenarios.
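
As an illustration of the operation being benchmarked (not the repository's C++ interface), PyTorch's built-in fused attention can be called as follows; whether a FlashAttention kernel is actually used depends on the device, dtype, and tensor shapes.

import torch
import torch.nn.functional as F

# One fused attention call over a full prompt (the prefill phase).
# Shapes are (batch, heads, seq_len, head_dim). On a GPU with fp16
# inputs this call can dispatch to a FlashAttention kernel; float32
# is used here so the sketch runs anywhere.
b, h, t, d = 1, 8, 128, 64
q, k, v = (torch.randn(b, h, t, d) for _ in range(3))
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([1, 8, 128, 64])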

decoding_attention (17 stars, 1 fork)

Decoding Attention is specially optimized for multi-head attention (MHA), using CUDA cores for the decoding stage of LLM inference.
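
For context, the decoding stage differs from prefill in that each step carries a single query token attending over a growing key/value cache. A plain-PyTorch sketch of that access pattern follows; it is illustrative only, not the repository's CUDA kernel.

import torch

# Decode-step attention: the one new token's query (seq_len 1) attends
# over the keys/values cached for all previous positions. This
# memory-bound pattern is what decode-stage kernels optimize.
b, h, d = 1, 8, 64
k_cache = torch.randn(b, h, 512, d)  # keys for 512 already-generated tokens
v_cache = torch.randn(b, h, 512, d)  # values for the same positions
q_new = torch.randn(b, h, 1, d)      # query for the single new token

scores = q_new @ k_cache.transpose(-2, -1) / d ** 0.5  # (b, h, 1, 512)
out = torch.softmax(scores, dim=-1) @ v_cache          # (b, h, 1, d)
print(out.shape)  # torch.Size([1, 8, 1, 64])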