[Question]: what's the intermediate prompt
Describe your problem
I am using the chat feature, and I am wondering what the content of the {knowledge} variable in the system prompt is. Is it the original chunks retrieved from the knowledge base?
If so, where can I find the actual intermediate prompt, or which code is responsible for building it?
Thank you!
Hiya, {knowledge} is a hard-coded variable, which refers to all chunks in your chosen knowledge base(s). In other words, with this prompt, chunks from your chosen knowledge base(s) will be fed to the chat model (LLM).
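A minimal sketch of what substituting such a placeholder might look like. This is illustrative only, not RAGFlow's actual code; the function and parameter names here are hypothetical:

```python
def build_prompt(system_prompt: str, chunks: list[str]) -> str:
    # Hypothetical sketch: join the chunks from the knowledge base(s)
    # and substitute them for the {knowledge} placeholder.
    knowledge = "\n".join(chunks)
    return system_prompt.replace("{knowledge}", knowledge)

prompt = build_prompt(
    "You are a helpful assistant. Use the following knowledge:\n{knowledge}",
    ["chunk one", "chunk two"],
)
```

After substitution, the placeholder is gone and the chunk text appears inline in the prompt sent to the LLM.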
Thanks for your reply.
I have another question: since my chunk size is large, the total length of the retrieved chunks may exceed the maximum number of tokens my LLM can accept. In that case, will RAGFlow directly truncate the retrieved chunks?
It will.
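A rough sketch of what such truncation could look like, assuming chunks are dropped once a token budget is exhausted. The names and the whitespace-based token counter here are hypothetical stand-ins, not RAGFlow's actual implementation:

```python
from typing import Callable

def truncate_chunks(
    chunks: list[str],
    max_tokens: int,
    count_tokens: Callable[[str], int] = lambda s: len(s.split()),
) -> list[str]:
    # Keep chunks in retrieval order until the token budget is spent;
    # any chunk that would overflow the budget (and everything after it)
    # is dropped.
    kept, used = [], 0
    for chunk in chunks:
        n = count_tokens(chunk)
        if used + n > max_tokens:
            break
        kept.append(chunk)
        used += n
    return kept
```

With a budget of 5 tokens and chunks of 3, 2, and 3 tokens, only the first two chunks fit; the third is discarded.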