multi-head-self-attention topic

Repositories:
multi-head_self-attention (70 stars, 14 forks)
A faster PyTorch implementation of multi-head self-attention.
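The repository's own code is not shown here; for orientation, this is a minimal NumPy sketch of the standard multi-head self-attention computation that such implementations optimize (function and variable names are illustrative, not taken from the repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """x: (seq_len, d_model); w_*: (d_model, d_model) projection matrices."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # project, then split into heads: (num_heads, seq_len, d_head)
    def split(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(x @ w_q), split(x @ w_k), split(x @ w_v)

    # scaled dot-product attention per head: (num_heads, seq_len, seq_len)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)

    # weighted sum of values, then merge heads back to (seq_len, d_model)
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o
```

Faster implementations typically fuse these projections and batch the per-head matmuls, but the result is numerically the same computation.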
Transformer-in-PyTorch (26 stars, 7 forks)
Transformer/Transformer-XL/R-Transformer examples and explanations.
EEG-ATCNet (155 stars, 19 forks)
Attention temporal convolutional network for EEG-based motor imagery classification.
BabyGPT-Build_GPT_From_Scratch (41 stars, 11 forks)
BabyGPT: build your own GPT large language model from scratch, with a step-by-step guide to pre-training generative transformer models in PyTorch and Python.
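The defining ingredient that turns self-attention into the decoder-style attention used in GPT models is the causal mask, which prevents each position from attending to later positions during pre-training. A minimal NumPy sketch of that masking step (names are illustrative, not from the repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_attention_weights(scores):
    """Apply a causal (autoregressive) mask to raw attention scores
    of shape (seq_len, seq_len), so token i attends only to j <= i."""
    seq_len = scores.shape[-1]
    # -inf above the diagonal zeroes out future positions after softmax
    mask = np.triu(np.full((seq_len, seq_len), -np.inf), k=1)
    return softmax(scores + mask, axis=-1)
```

In a full model these weights multiply the value vectors exactly as in un-masked self-attention; only the mask differs.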