
[Question]: Chat - list index out of range error

meetcharan-practice opened this issue 1 month ago • 4 comments

Self Checks

  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (Language Policy).
  • [x] Non-English title submissions will be closed directly (Language Policy).
  • [x] Please do not modify this template :) and fill in all the required fields.

Describe your problem

Hello,

I started from scratch and rebuilt the RAGFlow image using the latest source. After deploying it in Docker, I was able to register a user, configure the LLMs, and create a Chat. However, when I try to interact with the Chat Assistant, I encounter the following error. Could you please advise?


Traceback (most recent call last):
  File "/ragflow/.venv/lib/python3.10/site-packages/quart/app.py", line 1500, in full_dispatch_request
    result = await self.dispatch_request(request_context)
  File "/ragflow/.venv/lib/python3.10/site-packages/quart/app.py", line 1591, in dispatch_request
    self.raise_routing_exception(request_)
  File "/ragflow/.venv/lib/python3.10/site-packages/quart/app.py", line 1578, in raise_routing_exception
    raise request.routing_exception
  File "/ragflow/.venv/lib/python3.10/site-packages/quart/ctx.py", line 73, in match_request
    ) = self.url_adapter.match(  # type: ignore
  File "/ragflow/.venv/lib/python3.10/site-packages/werkzeug/routing/map.py", line 629, in match
    raise NotFound() from None
werkzeug.exceptions.NotFound: 404 Not Found: The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.
[2025-12-05 11:54:43 +0100] [24] [INFO] 127.0.0.1:54210 GET /v1/document/image/ 1.1 200 65 5974
2025-12-05 11:54:44,679 ERROR 24 LLMBundle.chat can't update token usage for 125c4164d1af11f0ad9046704e79750f/CHAT llm_name: gpt-4o@Azure-OpenAI, used_tokens: 1061
2025-12-05 11:54:45,488 ERROR 24 LLMBundle.chat can't update token usage for 125c4164d1af11f0ad9046704e79750f/CHAT llm_name: gpt-4o@Azure-OpenAI, used_tokens: 121
2025-12-05 11:54:46,007 INFO 24 POST http://es01:9200/ragflow_125c4164d1af11f0ad9046704e79750f/_search [status:200 duration:0.077s]
2025-12-05 11:54:46,123 INFO 24 POST http://es01:9200/ragflow_125c4164d1af11f0ad9046704e79750f/_search [status:200 duration:0.006s]

meetcharan-practice avatar Dec 05 '25 11:12 meetcharan-practice

The "list index out of range" error in the chat assistant is caused by unguarded list accesses in the backend chat logic. Specifically, the code assumes there is always at least one user message and at least one knowledge chunk retrieved from the knowledge base. If either is missing—such as when the chat is started with no user input, or if the knowledge base is empty or not properly configured—the backend tries to access elements that don't exist, triggering this error. This can also happen if the message list is malformed or filtered down to non-user entries, or if no knowledge is retrieved for the query.

To work around this, make sure your chat always includes at least one user message and that your knowledge base is properly set up and contains documents. Adding a dummy document to the knowledge base can help avoid the error if you're just testing. For a more robust fix, the backend should add guards to check for empty lists before accessing them, or return a user-friendly message when no knowledge or user input is available.
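The guard described above can be sketched as follows. This is a hypothetical illustration, not RAGFlow's actual `dialog_service.py` code: the function and variable names (`last_user_message`, `answer`, `chunks`) are made up to show the pattern of checking for an empty message list and an empty retrieval result before indexing.

```python
# Hypothetical sketch of guarded list access in a chat handler.
# Names and structure are illustrative, not RAGFlow's real code.

def last_user_message(messages):
    """Return the most recent user message, or None if there is none."""
    user_msgs = [m for m in messages if m.get("role") == "user"]
    return user_msgs[-1] if user_msgs else None  # guarded, unlike messages[-1]

def answer(messages, chunks):
    question = last_user_message(messages)
    if question is None:  # no user input at all
        return "Please enter a question to start the chat."
    if not chunks:  # empty knowledge base or no retrieval hits
        return "No relevant knowledge was found for your question."
    top = chunks[0]  # safe to index now
    return f"Answering '{question['content']}' using chunk: {top}"
```

An unguarded version that does `messages[-1]` or `chunks[0]` directly raises `IndexError: list index out of range` in exactly the two cases handled above.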

If you continue to see this error even with a populated knowledge base and valid user messages, double-check your chat message structure and knowledge base configuration.

You can see the relevant code and logic in the chat function here: dialog_service.py.





dosubot[bot] avatar Dec 05 '25 11:12 dosubot[bot]

What about checking the embedding and LLM configuration again? I suspect one of them is failing to invoke.

KevinHuSh avatar Dec 08 '25 03:12 KevinHuSh

@KevinHuSh - Yes, it appears that recent changes have impacted the behavior when connecting the chat/agent through Azure OpenAI. The LLM is no longer responding as expected. When I connect the same LLMs through LM Studio, everything works fine, but the above error consistently appears in the trace.

meetcharan-practice avatar Dec 08 '25 07:12 meetcharan-practice

@KevinHuSh - I have added a few more observations in my comment under https://github.com/infiniflow/ragflow/issues/11750#issuecomment-3624143338

meetcharan-practice avatar Dec 08 '25 10:12 meetcharan-practice

Solved in #11750

Magicbook1108 avatar Dec 17 '25 07:12 Magicbook1108