
[Inference/Kernel] Add PagedAttention v2: support seq length split across thread blocks

SunflowerAries opened this issue 9 months ago · 1 comment

📌 Checklist before creating the PR

  • [ ] I have created an issue for this PR for traceability
  • [ ] The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • [ ] I have added relevant tags if possible for us to better distinguish different PRs
  • [ ] I have installed pre-commit: pip install pre-commit && pre-commit install

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Summarize your work here. If you have any plots/diagrams/screenshots/tables, please attach them here.

[Figure: FlashDecodingAttention benchmarking results]
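For context, the core idea behind splitting the sequence length across thread blocks (as in PagedAttention v2 / flash-decoding) is that each block computes a partial, unnormalized attention output over its KV chunk, and a final reduction combines the partials via a log-sum-exp rescaling. The sketch below is a minimal NumPy illustration of that reduction, not the actual CUDA kernel in this PR; all function names are hypothetical.

```python
import numpy as np

def attention_reference(q, k, v):
    """Plain single-pass attention for one query: q (d,), k/v (L, d)."""
    s = k @ q / np.sqrt(q.shape[0])
    p = np.exp(s - s.max())
    return (p @ v) / p.sum()

def split_kv_attention(q, k, v, num_splits=4):
    """Split the KV sequence into chunks (one per "thread block"),
    compute partial softmax statistics per chunk, then merge them
    with a log-sum-exp rescaling, as flash-decoding does."""
    d = q.shape[0]
    chunks = np.array_split(np.arange(k.shape[0]), num_splits)
    partial_out, partial_max, partial_sum = [], [], []
    for idx in chunks:
        s = k[idx] @ q / np.sqrt(d)
        m = s.max()                      # per-chunk max for stability
        p = np.exp(s - m)
        partial_out.append(p @ v[idx])   # unnormalized partial output
        partial_max.append(m)
        partial_sum.append(p.sum())
    # Final reduction: rescale each partial to the global max, then normalize.
    m_global = max(partial_max)
    scale = [np.exp(m - m_global) for m in partial_max]
    total = sum(sc * ps for sc, ps in zip(scale, partial_sum))
    return sum(sc * po for sc, po in zip(scale, partial_out)) / total
```

The split version reproduces the single-pass result exactly (up to float rounding), while letting long sequences be processed by independent blocks in parallel.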

💥 Checklist before requesting a review

  • [ ] I have linked my PR to an issue (instruction)
  • [ ] My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • [ ] I have performed a self-review of my code
  • [ ] I have added thorough tests.
  • [ ] I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • [ ] 🌝 Yes, I do.
  • [ ] 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

SunflowerAries — May 10, 2024

Please add a test pic, along with a performance pic.

Courtesy-Xs — May 10, 2024