[BUG] Hi, following your instructions I launched Qwen-14B-Chat with vLLM, but the context length tops out at 2048. How can I fix this? Thank you!
Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?
- [X] I have searched the FAQ
Current Behavior
The context length is capped at a 2048-token window.
Expected Behavior
An 8k-token (8192) context window.
Steps To Reproduce
No response
Environment
- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):
Anything else?
No response
You need to change `seq_length` in the weights repo's `config.json` from 2048 to 8192.
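For reference, a minimal sketch of that edit, assuming the weights sit in a local directory (the `Qwen-14B-Chat` path below is a placeholder; adjust it to wherever your checkpoint lives):

```python
import json
from pathlib import Path

# Placeholder path: point this at your local copy of the model weights.
config_path = Path("Qwen-14B-Chat") / "config.json"
config = json.loads(config_path.read_text(encoding="utf-8"))

# Raise the maximum sequence length from 2048 to 8192.
config["seq_length"] = 8192
config_path.write_text(json.dumps(config, indent=2, ensure_ascii=False), encoding="utf-8")
```

Depending on your vLLM version, passing `max_model_len=8192` when constructing `vllm.LLM` (or `--max-model-len 8192` on the server command line) may achieve the same thing, but the `config.json` edit is the fix confirmed in the issues below.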
Please refer to the replies in the following issues:
- https://github.com/QwenLM/Qwen/issues/864
- https://github.com/QwenLM/Qwen/issues/849
- https://github.com/QwenLM/Qwen/issues/827
- https://github.com/QwenLM/Qwen/issues/658