[Bug]: DeepSeek-R1 Output Truncated at 618 Characters Despite Max Token Setting of 10,000 in chat assistant
Is there an existing issue for the same bug?
- [x] I have checked the existing issues.
RAGFlow workspace code commit ID
38e551cc3d4
RAGFlow image version
v0.16.0-86-g38e551cc full
Other environment information
Ubuntu 22.04
Actual behavior
In the chat, I used the DeepSeek-R1 model and set the maximum output token limit to 10,000. However, when I asked a question, the answer was truncated at 618 characters, far short of the 10,000-token limit.
Expected behavior
The expected answer output should not be truncated.
Steps to reproduce
1. Configure the model deepseek-r1 (Tian yi yun).
2. Set max tokens to 10,000.
3. Chat.
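To isolate whether the truncation comes from RAGFlow or from the provider, one could send the same request directly to an OpenAI-compatible chat endpoint. A minimal sketch follows; the endpoint URL is a placeholder and `build_chat_request` is a hypothetical helper, not part of RAGFlow or the provider's SDK:

```python
import json

# Placeholder endpoint -- substitute the provider's real
# OpenAI-compatible URL; this is not Tianyi Cloud's actual address.
API_URL = "https://example.com/v1/chat/completions"

def build_chat_request(question: str, max_tokens: int = 10000) -> dict:
    """Build the JSON body for a single-turn chat completion,
    mirroring the issue's settings: a fresh conversation (no history)
    and max_tokens=10000."""
    return {
        "model": "deepseek-r1",
        "messages": [{"role": "user", "content": question}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Explain retrieval-augmented generation.")
print(json.dumps(payload, indent=2))
# Sending would be e.g.: requests.post(API_URL, json=payload, headers=...)
# If the response's finish_reason is "length", the completion limit was
# enforced upstream of RAGFlow.
```

If the direct call is also truncated at roughly the same length, the cap is on the provider side rather than in RAGFlow's chat assistant.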
Additional information
No response
Max tokens covers the whole conversation, including history. Please confirm whether the conversation history has already consumed many tokens.
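The budget arithmetic implied by this comment can be sketched as follows. This assumes max tokens is a whole-conversation budget (prompt + history + completion), which is the comment's claim and not a confirmed description of the provider's API:

```python
def completion_budget(max_tokens: int, history_tokens: int,
                      prompt_tokens: int = 0) -> int:
    """Tokens left for the new answer if the max-token setting is a
    whole-conversation budget, as suggested above. Illustrative
    assumption only, not verified API behaviour."""
    return max(0, max_tokens - history_tokens - prompt_tokens)

# With an empty history (a brand-new conversation) the full budget
# remains, so history alone would not explain a ~618-character answer:
print(completion_budget(10000, 0))   # 10000
```

A long prior conversation, by contrast, could shrink the remaining budget to a few hundred tokens and produce exactly this kind of truncation.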
Is the conversation history assistant-based or chat-based?
> Max tokens is the conversation length including history. Please confirm whether the conversation history has consumed many tokens.
This is a newly started conversation, the first question; the chat history should be empty.
I have the same problem. Do you have an answer for it, please? @qust5979