flash-attention-3 topic
flash-attention-3 repositories
Awesome-LLM-Inference
2.6k Stars · 175 Forks
📖 A curated list of awesome LLM inference papers with code: TensorRT-LLM, vLLM, streaming-llm, AWQ, SmoothQuant, WINT8/4, Continuous Batching, FlashAttention, PagedAttention, etc.
CUDA-Learn-Notes
1.2k Stars · 133 Forks
🎉 Modern CUDA learning notes with PyTorch: fp32, fp16, bf16, fp8/int8, flash_attn, sgemm, sgemv, warp/block reduce, dot, elementwise, softmax, layernorm, rmsnorm.
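
Both repositories center on FlashAttention-style fused attention kernels. As a quick orientation, the sketch below shows one common way such a kernel is reached from PyTorch; it assumes PyTorch 2.x, where torch.nn.functional.scaled_dot_product_attention can dispatch to a FlashAttention backend on supported CUDA GPUs. The shapes and tensors are illustrative only and are not taken from either repository.

    import torch
    import torch.nn.functional as F

    # Illustrative shapes: (batch, heads, seq_len, head_dim).
    device = "cuda" if torch.cuda.is_available() else "cpu"
    dtype = torch.float16 if device == "cuda" else torch.float32
    q = torch.randn(2, 8, 1024, 64, device=device, dtype=dtype)
    k = torch.randn_like(q)
    v = torch.randn_like(q)

    # On supported CUDA GPUs PyTorch can dispatch this call to a fused
    # FlashAttention kernel; elsewhere it falls back to a math backend.
    # The fused path never materializes the full seq_len x seq_len score
    # matrix, which is the core memory saving FlashAttention provides.
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
    print(out.shape)  # torch.Size([2, 8, 1024, 64])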