
[Inference] Add Streaming LLM

Open isky-cd opened this issue 1 year ago • 0 comments

📌 Checklist before creating the PR

  • [ ] I have created an issue for this PR for traceability
  • [x] The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • [ ] I have added relevant tags if possible for us to better distinguish different PRs
  • [x] I have installed pre-commit: pip install pre-commit && pre-commit install

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

πŸ“ What does this PR do?

Summarize your work here. If you have any plots/diagrams/screenshots/tables, please attach them here.

Following the StreamingLLM paper (https://arxiv.org/pdf/2309.17453), we integrated StreamingLLM into our inference framework and adapted it to the framework's existing KV cache management mechanism.
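
For reviewers unfamiliar with the paper: StreamingLLM keeps a few initial "attention sink" tokens plus a sliding window of the most recent tokens in the KV cache and evicts everything in between, so generation can continue on long streams with a bounded cache. The snippet below is a minimal, framework-agnostic sketch of that eviction rule, not the code in this PR; the parameter names `sink_size` and `window_size` are assumptions for illustration.

```python
# Minimal sketch of the StreamingLLM idea (attention sinks + sliding window),
# NOT the actual implementation in this PR. `sink_size` and `window_size`
# are illustrative parameter names.
import torch


def evict_kv_cache(key_cache: torch.Tensor,
                   value_cache: torch.Tensor,
                   sink_size: int = 4,
                   window_size: int = 1024):
    """Keep the first `sink_size` tokens (attention sinks) and the most recent
    `window_size` tokens; evict everything in between.

    Both caches are assumed to have shape [batch, num_heads, seq_len, head_dim].
    """
    seq_len = key_cache.shape[2]
    if seq_len <= sink_size + window_size:
        return key_cache, value_cache  # nothing to evict yet

    device = key_cache.device
    keep = torch.cat([
        torch.arange(sink_size, device=device),                       # sink tokens
        torch.arange(seq_len - window_size, seq_len, device=device),  # recent tokens
    ])
    return (key_cache.index_select(2, keep),
            value_cache.index_select(2, keep))
```

In this PR the same rule is applied through the inference framework's own cache management rather than by slicing dense tensors as shown here.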

Test prompt: "Introduce some landmarks in the United Kingdom, such as"

[screenshot: generation without StreamingLLM]
[screenshot: generation with StreamingLLM]

💥 Checklist before requesting a review

  • [ ] I have linked my PR to an issue (instruction)
  • [x] My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • [x] I have performed a self-review of my code
  • [x] I have added thorough tests
  • [x] I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • [x] 🌝 Yes, I do.
  • [ ] 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

isky-cd · May 23 '24 02:05