
[Bug]: DeepSeek-R1 Output Truncated at 618 Characters Despite Max Token Setting of 10,000 in chat assistant

Open qust5979 opened this issue 11 months ago • 4 comments

Is there an existing issue for the same bug?

  • [x] I have checked the existing issues.

RAGFlow workspace code commit ID

38e551cc3d4

RAGFlow image version

v0.16.0-86-g38e551cc full

Other environment information

Ubuntu 22.04

Actual behavior

In the chat assistant, I used the DeepSeek-R1 model and set the maximum output token limit to 10,000. However, when I asked a question, the answer was truncated at 618 characters, far below what a 10,000-token limit should allow.
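One way to narrow this down is to call the same model endpoint directly with the same max tokens setting and check the returned finish_reason: if the provider itself stops with "length" far below the configured limit, the truncation happens upstream of RAGFlow. Below is a minimal sketch, assuming the Tianyi Cloud DeepSeek-R1 endpoint is OpenAI-compatible; the base URL, API key, and model id are placeholders, not values taken from this report.

```python
# Minimal sketch: query the model endpoint directly to see whether the
# truncation comes from the provider or from RAGFlow's chat assistant.
# Assumes an OpenAI-compatible endpoint; base_url, api_key, and the model
# id are placeholders for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-tianyi-endpoint/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                         # placeholder key
)

resp = client.chat.completions.create(
    model="deepseek-r1",   # placeholder model id
    max_tokens=10000,      # same limit configured in the chat assistant
    messages=[{"role": "user",
               "content": "Explain retrieval-augmented generation in detail."}],
)

choice = resp.choices[0]
print("finish_reason:", choice.finish_reason)  # "length" => provider-side truncation
print("completion_tokens:", resp.usage.completion_tokens)
print(len(choice.message.content or ""), "characters returned")
```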

[screenshot attached]

Expected behavior

The expected answer output should not be truncated.

Steps to reproduce

1. Configure the DeepSeek-R1 model (Tianyi Cloud).
2. Set max tokens to 10,000.
3. Chat.

Additional information

No response

qust5979 avatar Feb 19 '25 10:02 qust5979

Max tokens is the total conversation length, including history. Please confirm whether the conversation history has already consumed many tokens.
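In rough terms, the budget left for the reply is whatever remains of max tokens after the system prompt, retrieved context, and prior turns are counted. A minimal sketch of that arithmetic, using tiktoken's cl100k_base encoding as a stand-in tokenizer (DeepSeek uses its own tokenizer, so the counts are only approximate):

```python
# Rough token-budget arithmetic: the completion budget is what is left of
# max_tokens after the prompt (system prompt + retrieved context + history).
# tiktoken's cl100k_base is only an approximation of DeepSeek's tokenizer.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def remaining_completion_budget(max_tokens, system_prompt, history, question):
    prompt_text = system_prompt + "".join(m["content"] for m in history) + question
    prompt_tokens = len(enc.encode(prompt_text))
    return max_tokens - prompt_tokens

budget = remaining_completion_budget(
    max_tokens=10000,
    system_prompt="You are a helpful assistant answering from the knowledge base.",
    history=[],  # a brand-new conversation, as in this report
    question="Explain the indexing pipeline.",
)
print("tokens left for the answer:", budget)
```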

liwenju0 avatar Feb 20 '25 03:02 liwenju0

Is the conversation history assistant-based or chat-based?

vector- avatar Feb 21 '25 03:02 vector-

> Max tokens is the total conversation length, including history. Please confirm whether the conversation history has already consumed many tokens.

This is a newly started conversation and this was the first question, so the chat history should be empty.

[screenshot attached]

qust5979 avatar Feb 21 '25 04:02 qust5979

I have the same problem. Have you found an answer for it? @qust5979

acc136 avatar Mar 01 '25 08:03 acc136