multi-head-attention topic

A list of repositories under the multi-head-attention topic:

DeepXi

487 stars · 125 forks

Deep Xi: a deep learning approach to a priori SNR estimation, implemented in TensorFlow 2/Keras, for speech enhancement and robust ASR.

TranAD

469 stars · 141 forks

[VLDB'22] Anomaly detection using Transformers, self-conditioning, and adversarial training.

dodrio

328 stars · 30 forks

Exploring attention weights in transformer-based models with linguistic knowledge.

attentions

487 stars · 74 forks

PyTorch implementations of various attention mechanisms for deep learning researchers.
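
As a reference point for what such collections implement, here is a minimal sketch of scaled dot-product attention, the primitive most attention variants build on. The function name and shapes are illustrative assumptions, not code from the repository:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """Minimal scaled dot-product attention: softmax(QK^T / sqrt(d)) V."""
    d_k = query.size(-1)
    # Similarity scores between every query and every key.
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_k ** 0.5
    if mask is not None:
        # Masked positions get -inf so they receive zero attention weight.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, value), weights
```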

Various-Attention-mechanisms

118 stars · 23 forks

This repository contains various attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, in PyTorch, TensorFlow, and Keras.
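
For orientation, here is a minimal sketch of Bahdanau-style additive attention in PyTorch; the module name and dimensions are illustrative assumptions, not code from this repository:

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""

    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)          # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)   # (batch, key_dim)
        return context, weights
```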

Attention-Visualization

65 stars · 15 forks

Visualization of simple attention and Google's multi-head attention.

VRP_DRL_MHA

161 stars · 35 forks

"Attention, Learn to Solve Routing Problems!"[Kool+, 2019], Capacitated Vehicle Routing Problem solver

multi-head_self-attention

68 stars · 14 forks

A faster PyTorch implementation of multi-head self-attention.
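
For comparison, a straightforward (unoptimized) multi-head self-attention module might look like the sketch below; it is a generic reference implementation under assumed names, not the repository's faster version:

```python
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    """Minimal multi-head self-attention as in 'Attention Is All You Need'."""

    def __init__(self, embed_dim, num_heads):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly across heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)  # fused Q, K, V projection
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim)
        b, t, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split the embedding into heads: (batch, num_heads, seq_len, head_dim).
        q, k, v = [z.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
                   for z in (q, k, v)]
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = scores.softmax(dim=-1)
        # Concatenate heads back into a single embedding and project out.
        out = (weights @ v).transpose(1, 2).reshape(b, t, self.num_heads * self.head_dim)
        return self.out(out)
```

The fused QKV projection is a common design choice: a single matrix multiply produces queries, keys, and values for all heads at once.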

Diversify-MHA

18 stars · 2 forks

Code for "Multi-Head Attention with Disagreement Regularization" (EMNLP 2018) and "Information Aggregation for Multi-Head Attention with Routing-by-Agreement" (NAACL 2019).
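
The disagreement idea can be sketched roughly as a penalty on how similar the heads' outputs are. The function below is an illustrative approximation of the output-disagreement variant, not the authors' code; the tensor layout is an assumption:

```python
import torch

def head_output_disagreement(head_outputs, eps=1e-8):
    """Mean pairwise cosine similarity between attention heads' outputs.

    head_outputs: (batch, num_heads, seq_len, head_dim). Adding this value
    to the training loss with a small weight penalizes redundant heads,
    i.e. encourages them to disagree. Diagonal (self-similarity) terms
    are constant and only shift the loss by a fixed offset.
    """
    o = head_outputs / (head_outputs.norm(dim=-1, keepdim=True) + eps)
    # sim[b, h, g, t] = cosine similarity of head h and head g at position t.
    sim = torch.einsum("bhtd,bgtd->bhgt", o, o)
    return sim.mean()
```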

point-transformer

35 stars · 6 forks

This is the official repository of the original Point Transformer architecture.