flash-attention-2 topic

Repositories tagged with flash-attention-2:

Awesome-LLM-Inference

4.9k stars · 330 forks

📚 A curated list of LLM/VLM inference papers with code: Flash-Attention, Paged-Attention, WINT8/4, parallelism, etc. 🎉

Shush

173 stars · 27 forks

Shush is an app that deploys a Whisper v3 model with Flash Attention v2 on Modal and sends requests to it from a Next.js front end.
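
For context, enabling Flash Attention 2 when loading a Whisper v3 checkpoint typically looks like the sketch below, using Hugging Face transformers. This is an illustration of the general pattern, not code taken from the Shush repository.

```python
# Sketch: load Whisper large-v3 with Flash Attention 2 via Hugging Face
# transformers. Illustrative only -- not Shush's actual deployment code.
# Assumes a CUDA GPU and the flash-attn package are available.
import torch
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor

model_id = "openai/whisper-large-v3"
model = AutoModelForSpeechSeq2Seq.from_pretrained(
    model_id,
    torch_dtype=torch.float16,                # FA2 kernels require fp16/bf16
    attn_implementation="flash_attention_2",  # select the FA2 attention backend
).to("cuda")
processor = AutoProcessor.from_pretrained(model_id)
```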

CUDA-Learn-Notes

1.2k stars · 133 forks

🎉 Modern CUDA Learn Notes with PyTorch: fp32, fp16, bf16, fp8/int8, flash_attn, sgemm, sgemv, warp/block reduce, dot, elementwise, softmax, layernorm, rmsnorm.
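
As an example of what one of the listed kernels computes: RMSNorm normalizes each feature vector by its root mean square and applies a learned scale. The plain PyTorch reference below shows the math a hand-written CUDA rmsnorm kernel implements; it is an illustration, not the repository's CUDA code.

```python
# Reference RMSNorm in plain PyTorch -- the computation a CUDA rmsnorm
# kernel reproduces. Illustrative; not taken from the repository.
import torch

def rms_norm(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Normalize by the root mean square over the last dimension,
    # then apply a learned per-feature scale.
    rms = torch.sqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return x / rms * weight

x = torch.randn(2, 8, 64)   # (batch, seq, hidden)
w = torch.ones(64)          # learned scale, initialized to 1
print(rms_norm(x, w).shape) # torch.Size([2, 8, 64])
```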

flash_attention_inference

20 stars · 2 forks

Benchmarks the performance of the C++ interface of Flash Attention and Flash Attention v2 in large language model (LLM) inference scenarios.
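
The same kernel can also be invoked from Python through the flash-attn package's flash_attn_func; the minimal sketch below shows that call for orientation, while the repository itself benchmarks the C++ interface directly.

```python
# Minimal call into the FlashAttention-2 kernel via the flash-attn
# Python package. Illustrative sketch; the repository benchmarks the
# C++ interface, not this Python wrapper.
# Assumes a CUDA GPU and `pip install flash-attn`.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 1, 1024, 16, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

out = flash_attn_func(q, k, v, causal=True)  # (batch, seqlen, nheads, headdim)
print(out.shape)
```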

flashattention2-custom-mask

65 stars · 5 forks

A Triton implementation of FlashAttention2 that adds support for custom attention masks. A plain PyTorch reference of the same semantics is sketched below.
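
The repository fuses an arbitrary mask into a Triton FlashAttention2 kernel; the sketch below only shows, in plain PyTorch, the masked-attention math such a kernel is expected to reproduce. The banded mask is a made-up example of a pattern a fixed causal flag cannot express.

```python
# Reference semantics of attention with a custom mask, in plain PyTorch.
# Illustrative only -- not the repository's Triton kernel.
import torch
import torch.nn.functional as F

batch, nheads, seqlen, headdim = 1, 4, 128, 64
q = torch.randn(batch, nheads, seqlen, headdim)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Boolean mask: True = attend, False = block. Here, a banded mask that
# lets each query see only keys within a window of 16 positions.
idx = torch.arange(seqlen)
mask = (idx[None, :] - idx[:, None]).abs() <= 16

out = F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
print(out.shape)  # torch.Size([1, 4, 128, 64])
```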