ragflow
[Bug]: When using an OpenAI-compatible model with the maximum token limit set to 128000, output is still truncated at around 8000 tokens.
Is there an existing issue for the same bug?
- [x] I have checked the existing issues.
RAGFlow workspace code commit ID
Unknown (how do I find it?)
RAGFlow image version
v0.16.0 full
Other environment information
Windows
Actual behavior
Expected behavior
No response
Steps to reproduce
See title.
Additional information
No response
I have a similar problem: when parsing knowledge base files, if the embedding model's sequence length is 512, requests to the model still easily exceed 512 tokens and cause parsing to fail, even when the chunk size is set well below 512.
Could this be a limit imposed by the API provider?
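One plausible explanation for the comment above (a sketch, not RAGFlow's actual code): chunk size is often measured in characters or in the chunker's own token units, while the embedding model counts tokens with its own tokenizer, where CJK text frequently costs roughly one token per character. The heuristic below (my assumption, purely illustrative) shows how a chunk "smaller than 512" can still blow past a 512-token embedding limit:

```python
def rough_token_estimate(text: str) -> int:
    """Very rough heuristic (illustrative only): CJK characters cost
    about 1 token each, other text about 1 token per 4 characters."""
    cjk = sum(1 for ch in text if "\u4e00" <= ch <= "\u9fff")
    other = len(text) - cjk
    return cjk + (other + 3) // 4

# 600 CJK characters: well under many character-based chunk limits,
# but far over a 512-token embedding sequence length.
chunk = "数据" * 300
print(rough_token_estimate(chunk))  # 600
```

If this is what is happening, lowering the chunk size further (or chunking by the embedding model's own tokenizer) would avoid the failure regardless of any provider-side limit.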
Disable the max token setting.
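The suggestion above can be sketched as follows: in an OpenAI-compatible request, omitting `max_tokens` entirely lets the server apply its own default or maximum, instead of hard-capping the completion. The field names follow the OpenAI Chat Completions API; the model name is a placeholder, and this is an illustration rather than RAGFlow's actual request code:

```python
from typing import Optional


def build_payload(prompt: str, max_tokens: Optional[int] = None) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    payload = {
        "model": "my-model",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    }
    # Only send max_tokens when explicitly set; leaving it out means the
    # server decides the output limit rather than a client-side cap.
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload


print("max_tokens" in build_payload("hi"))          # False
print(build_payload("hi", 128000)["max_tokens"])    # 128000
```

If the truncation persists even with the limit disabled, the cap is likely enforced server-side by the API provider.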