
[Bug]: When the keyword match exceeds a certain length, the prompt is replaced by the system.

Open bbj677 opened this issue 10 months ago • 3 comments

Self Checks

  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (Language Policy).
  • [x] Non-English title submissions will be closed directly (Language Policy).
  • [x] Please do not modify this template :) and fill in all the required fields.

RAGFlow workspace code commit ID

8e96504

RAGFlow image version

v0.17.1

Other environment information

deepseek r1 14b
nginx

Actual behavior

When the knowledge base parses an Excel file in table mode and the data matched by a query exceeds a certain length, the system replaces the chatbot prompt. As a result, the large language model is never invoked and the matched data is returned directly.

Image

When the volume of data matched by the keyword is small, the behavior is normal, as shown in the figure:

Image

Image

Additionally, the Markdown conversion is quite unstable: sometimes the result is rendered as an HTML table, and sometimes it is not.
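As one way around the unstable rendering, a client could normalize retrieved chunks itself. The sketch below is a hypothetical helper (not part of RAGFlow) that deterministically converts a Markdown pipe table into an HTML table, so the display is consistent regardless of what the model emits:

```python
def md_table_to_html(md: str) -> str:
    """Render a simple Markdown pipe table as an HTML table (sketch only)."""
    lines = [ln.strip() for ln in md.strip().splitlines() if ln.strip()]
    rows = [[c.strip() for c in ln.strip("|").split("|")] for ln in lines]
    # Drop the separator row (e.g. |---|---|) if present.
    body = [r for r in rows if not all(set(c) <= set("-: ") for c in r)]
    header, data = body[0], body[1:]
    html = ["<table>",
            "<tr>" + "".join(f"<th>{h}</th>" for h in header) + "</tr>"]
    for r in data:
        html.append("<tr>" + "".join(f"<td>{c}</td>" for c in r) + "</tr>")
    html.append("</table>")
    return "".join(html)

print(md_table_to_html("| name | qty |\n|---|---|\n| apple | 3 |"))
# → <table><tr><th>name</th><th>qty</th></tr><tr><td>apple</td><td>3</td></tr></table>
```

This only handles well-formed pipe tables; a production version would need to cope with escaped pipes and alignment markers.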

Expected behavior

The chatbot should handle Excel files with a large amount of data, behaving the same as it does when the matched result set is small.

Steps to reproduce

1. Create a knowledge base in table analysis mode.  
2. Import an Excel file containing 145 rows of data.  
3. Write chatbot prompts and reference the knowledge base.  
4. Test the questions.

Additional information

No response

bbj677 — Mar 12 '25 02:03

For a Table-parsed KB, the result is not generated by the LLM, so no prompt is needed. The LLM just turns your question into ES SQL to select the data.
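In other words, the retrieval flow returns the selected rows verbatim instead of an LLM-written answer. A minimal sketch of that flow, using sqlite3 as a stand-in for Elasticsearch SQL and a hypothetical `llm_to_sql` placeholder for the question-to-SQL step:

```python
import sqlite3

def llm_to_sql(question: str) -> str:
    # In RAGFlow the LLM produces this query; hard-coded here for illustration.
    return "SELECT name, price FROM products WHERE price > 100"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, price REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("laptop", 999.0), ("mouse", 25.0), ("monitor", 180.0)])

# The selected rows ARE the final answer; there is no LLM summarization step.
rows = conn.execute(llm_to_sql("Which products cost over 100?")).fetchall()
print(rows)
```

This illustrates why the chatbot prompt has no effect on large matches: the response never passes through the generation stage.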

KevinHuSh — Mar 12 '25 04:03

> For a Table-parsed KB, the result is not generated by the LLM, so no prompt is needed. The LLM just turns your question into ES SQL to select the data.

Then why does the LLM generate additional content based on the prompt when the volume of matched data is small? Also, I don't want the query to return results directly. I'd like the LLM to analyze and summarize the knowledge base. What should I do?

bbj677 — Mar 12 '25 04:03

What about parsing them with the General chunking method instead?

KevinHuSh — Mar 12 '25 04:03