flash-attention-3 topic
flash-attention-3 repositories
Awesome-LLM-Inference
4.9k Stars · 330 Forks · 4.9k Watchers
📚 A curated list of awesome LLM/VLM inference papers with code: Flash-Attention, Paged-Attention, WINT8/4, parallelism, etc. 🎉
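For orientation, the computation that Flash-Attention and its variants accelerate is ordinary scaled dot-product attention. A minimal PyTorch reference (the textbook formula, not code from any repository above):

```python
import math
import torch

def naive_attention(q, k, v):
    """Reference scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    q, k, v: (batch, heads, seqlen, head_dim). Flash-Attention computes the
    same result without materializing the (seqlen x seqlen) score matrix."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(1, 8, 128, 64)
out = naive_attention(q, k, v)  # shape: (1, 8, 128, 64)
```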
CUDA-Learn-Notes
1.2k Stars · 133 Forks
🎉 Modern CUDA learning notes with PyTorch: fp32/fp16/bf16/fp8/int8 kernels covering flash_attn, sgemm, sgemv, warp/block reduce, dot, elementwise, softmax, layernorm, and rmsnorm.
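As a sense of what the rmsnorm kernels catalogued there compute: a hand-written CUDA rmsnorm is typically validated against a PyTorch reference like the sketch below (my illustration; the repository's own test harness may differ).

```python
import torch

def rms_norm(x, weight, eps=1e-6):
    """Reference RMSNorm: x / sqrt(mean(x^2) + eps) * weight.
    A fused CUDA kernel computes the same thing in one pass over each row,
    typically using a warp/block reduction for the mean of squares."""
    rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return x * rms * weight

x = torch.randn(4, 4096)
w = torch.ones(4096)
y = rms_norm(x, w)  # shape: (4, 4096)
```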
flash-attention3-wheels
19 Stars · 0 Forks · 19 Watchers
Pre-built wheels that eliminate Flash Attention 3 installation headaches.
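Once a matching wheel is installed, usage follows the upstream Flash Attention 3 beta. A minimal sketch, assuming the wheel exposes the upstream flash_attn_interface module; check the repository's README for the exact package name and the wheel matching your Python/CUDA/PyTorch versions:

```python
# Hypothetical install step -- pick the wheel for your Python, CUDA,
# and PyTorch versions from the repository's releases page:
#   pip install <flash-attention-3 wheel URL>
import torch
from flash_attn_interface import flash_attn_func  # upstream FA3 beta import path

# FA3 targets Hopper GPUs and fp16/bf16 inputs laid out as
# (batch, seqlen, num_heads, head_dim).
q = torch.randn(2, 1024, 16, 128, dtype=torch.bfloat16, device="cuda")
k = torch.randn(2, 1024, 16, 128, dtype=torch.bfloat16, device="cuda")
v = torch.randn(2, 1024, 16, 128, dtype=torch.bfloat16, device="cuda")

# Same output shape as q; note some FA3 beta versions return a tuple
# (out, softmax_lse) instead of a single tensor.
out = flash_attn_func(q, k, v, causal=True)
```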