ChatGLM-6B
How do I increase `max_new_tokens`?
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
Input length of input_ids is 2183, but `max_length` is set to 2048. This can lead to unexpected behavior. You should consider increasing `max_new_tokens`.
After this warning, the responses become very short, but I cannot find where to increase `max_new_tokens`.
Expected Behavior
No response
Steps To Reproduce
run web_demo.py
Environment
- OS: Windows 11
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :
Anything else?
No response
Pass a larger `max_length` directly to the chat call: `model.chat(tokenizer, query, history, max_length=xxxx)`
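For reference, a minimal sketch of doing this end to end, following the model-loading code from the repo README; the value 4096 is only an illustrative choice, not a repo default:

```python
# Minimal sketch: load ChatGLM-6B and raise the generation window.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

history = []
query = "你好"
# Passing max_length here overrides the 2048 default that triggers the warning.
response, history = model.chat(tokenizer, query, history=history, max_length=4096)
print(response)
```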
Duplicate of #17