multihead-attention topic
awesome-fast-attention
A curated list of efficient attention modules
multihead-siamese-nets
Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
multi-head_self-attention
A Faster Pytorch Implementation of Multi-Head Self-Attention
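Since most entries in this list implement the same core operation, here is a minimal NumPy sketch of multi-head self-attention as described in "Attention is All You Need": the input is projected to queries, keys, and values, split into heads, attended with scaled dot-product attention per head, and the concatenated head outputs are projected back. Function and variable names are illustrative, not taken from any listed repository.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Scaled dot-product self-attention split across num_heads heads."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project to queries/keys/values, then split into heads: (heads, seq, d_head).
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Attention weights per head: softmax(QK^T / sqrt(d_head)).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    # Weighted sum of values, concatenate heads, apply output projection.
    out = (weights @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o

rng = np.random.default_rng(0)
seq_len, d_model, heads = 4, 8, 2
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) for _ in range(4))
y = multi_head_self_attention(x, w_q, w_k, w_v, w_o, heads)
print(y.shape)  # (4, 8)
```

The output has the same shape as the input, which is what lets attention layers stack inside a Transformer block.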
Transformer
A chatbot built with TensorFlow (the model is a Transformer); written in Korean.
Advanced_Models
Provides implementations of several well-known neural network models (DCGAN, VAE, ResNet, etc.).
pytorch-transformer
Implementation of "Attention is All You Need" paper
multi-head-attention-labeller
Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.
TransformerX
Flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜)
awesome-Transformers-For-Segmentation
Semantic segmentation is an important task in computer vision, and its applications have grown in popularity over the last decade. We grouped the publications that used various forms of segmentation in...