ChatGLM-6B

How do I increase max_new_tokens?

Open Dfaker-HK opened this issue 1 year ago • 1 comment

Is there an existing issue for this?

  • [X] I have searched the existing issues

Current Behavior

Input length of input_ids is 2183, but max_length is set to 2048. This can lead to unexpected behavior. You should consider increasing max_new_tokens. After this warning, the answers become very short, but I can't find where to increase max_new_tokens.

Expected Behavior

No response

Steps To Reproduce

run web_demo.py

Environment

- OS: Windows 11
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

No response

Dfaker-HK avatar Mar 15 '23 17:03 Dfaker-HK

model.chat(tokenizer, query, history, max_length=xxxx)

JiweiZh avatar Mar 21 '23 05:03 JiweiZh
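To make the one-line answer above concrete: in ChatGLM-6B, `max_length` covers the total token budget (prompt plus generated reply), so raising it past 2048 removes the truncation warning. The sketch below does not download the real model; `StubModel` is a hypothetical stand-in used only to show how the `max_length` keyword is threaded through a `model.chat(tokenizer, query, history, max_length=...)` call. The value 4096 is an example, not a recommendation.

```python
# Hypothetical stand-in for ChatGLM-6B's chat API; the real model's
# signature is model.chat(tokenizer, query, history, max_length=...).
class StubModel:
    def chat(self, tokenizer, query, history, max_length=2048):
        # The real model stops generating once the total token count
        # reaches max_length; the stub just echoes the limit so the
        # keyword forwarding is visible.
        reply = f"(reply limited to {max_length} total tokens)"
        history = history + [(query, reply)]
        return reply, history

model = StubModel()
# Raising max_length from the 2048 default avoids the truncation warning.
response, history = model.chat(None, "hello", [], max_length=4096)
print(response)
```

In `web_demo.py` the same keyword can be passed at the call site where `model.chat` (or `model.stream_chat`) is invoked.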

Duplicate of #17

zhangch9 avatar Aug 15 '23 11:08 zhangch9