[Bug]: With max_token disabled, output is still truncated
Describe your problem
If the Ollama client is used directly, the output is not truncated.
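For anyone who wants to reproduce this check, here is a minimal sketch that calls Ollama directly, bypassing RAGFlow. It assumes the `ollama` Python package is installed and a model is pulled locally (the model name `llama3` is just a placeholder, substitute whatever you run):

```python
# Query Ollama directly, bypassing RAGFlow, to see whether the
# truncation happens in the model server or in RAGFlow itself.
import ollama

stream = ollama.chat(
    model="llama3",  # placeholder; use your own pulled model
    messages=[{"role": "user",
               "content": "Write a long, detailed answer about RAG pipelines."}],
    stream=True,
)

full = ""
for chunk in stream:
    full += chunk["message"]["content"]

# If this prints a complete answer, the cutoff is introduced by
# RAGFlow rather than by Ollama.
print(len(full))
print(full[-200:])
```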
@KevinHuSh, how can this problem be solved?
A truncated chat
I encountered the same problem
I have tested different versions, and this is a new bug introduced in v0.17.0.
It retrieved a lot of content while answering, so the truncation may be caused by the context length. Apart from that, the connection to Ollama may be lost because of slow computation.
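One way to rule out the context-length theory is to call Ollama with an explicit context window and no output cap. `num_ctx` and `num_predict` are standard Ollama options; the values below are illustrative assumptions, not a confirmed fix:

```python
# Enlarge the context window and remove the generation cap,
# then check whether the answer is still cut off.
import ollama

resp = ollama.chat(
    model="llama3",  # placeholder; use your own pulled model
    messages=[{"role": "user", "content": "Your long RAG-style prompt here"}],
    options={
        "num_ctx": 8192,    # context window larger than the default
        "num_predict": -1,  # -1 = generate until the model stops on its own
    },
)
print(resp["message"]["content"])
```

If the output is still truncated with these options, the context length is probably not the cause.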
I ran into the same problem. Also, how did you manage to put the thinking process inside a quote block? Could you share how you did that?
A truncated chat