[Question]: About variables in prompts
Self Checks
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (Language Policy).
- [x] Non-English title submissions will be closed directly (Language Policy).
- [x] Please do not modify this template :) and fill in all the required fields.
Describe your problem
Hi, I have a question. Suppose I have two keys, {knowledge} and {A}, in my prompt. I wonder how "A" would be assigned by RAGFlow in this situation.
RAGFlow uses a variables attribute within the Prompt object to manage variables in prompts. The knowledge variable is a reserved variable that represents the retrieved chunks from the knowledge base. However, for custom variables like {A}, the assignment process is not explicitly detailed in the available documentation. If {A} is not predefined in the configuration, it might require manual assignment or additional configuration to define its value. Unfortunately, specific details on how RAGFlow assigns values to such custom variables are not available in the current documentation [1][2].
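To make the description above concrete, here is a minimal sketch of what such a prompt configuration might look like as a Python dict. The field names (`system`, `variables`, `key`, `optional`) are based on my reading of RAGFlow's chat-assistant API and should be treated as an assumption; verify them against the documentation for your version.

```python
# Hedged sketch of a chat assistant's prompt configuration.
# The "variables" entries declare which placeholders the prompt accepts;
# "knowledge" is the reserved variable filled with retrieved chunks.
prompt_config = {
    "system": "Here is the knowledge base: {knowledge}. Answer style: {A}.",
    "variables": [
        {"key": "knowledge", "optional": False},  # reserved: retrieved chunks
        {"key": "A", "optional": True},           # custom, supplied per request
    ],
}

# Placeholders the prompt declares and therefore expects at request time:
declared = [v["key"] for v in prompt_config["variables"]]
print(declared)  # → ['knowledge', 'A']
```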
- In the Chat Configuration dialogue, switch to the Prompt Engine tab.
- In the Variable field, click Add to add your variable, say {A}.
- In the System prompt field, insert your newly-added variable where applicable.
- Use Converse with chat assistant and pass in the value for {A} as a separate request parameter.
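Conceptually, the steps above amount to template substitution: RAGFlow replaces each {variable} placeholder in the system prompt with the value supplied in the request. A minimal sketch of that idea (the function here is illustrative, not RAGFlow's actual implementation; leaving unknown placeholders untouched is an assumption, not a documented guarantee):

```python
import re

def fill_prompt(template: str, variables: dict) -> str:
    """Replace each {name} placeholder with its supplied value.

    Placeholders with no matching value are left as-is, so you can
    spot them in the final prompt.
    """
    def substitute(match):
        name = match.group(1)
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{(\w+)\}", substitute, template)

prompt = "Here is the knowledge base: {knowledge}. Answer in the style: {A}."
filled = fill_prompt(prompt, {"knowledge": "retrieved chunks", "A": "concise"})
print(filled)
# → Here is the knowledge base: retrieved chunks. Answer in the style: concise.
```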
Let me know if it does not work.
I'm running into a similar issue. I've added multiple variables in the prompt engine and inserted them into the system prompt field.
This is the system prompt:

You are an intelligent assistant. Please summarize the content of the knowledge base to answer the question. Please list the data in the knowledge base and answer in detail. When all knowledge base content is irrelevant to the question, your answer must include the sentence "The answer you are looking for is not found in the knowledge base!" Answers need to consider chat history and need to follow a few rules:
The style of the generated answer: {style}
The answer should respect the following restrictions: {restriction}
The user is asking their questions because {purpose}
Here is the knowledge base: {knowledge}
The above is the knowledge base. The user has also provided extra documents that may be relevant to their questions {extra_document}
When sending my request using session.ask with the Python API, it does require the listed variables to send the request properly, but from the generated responses I get the feeling that these variables are not being used correctly.
Is there a way I can verify if these are properly being used in the system prompt?
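One local sanity check, assuming you can capture the final prompt text somewhere (for example from the light-bulb view or server logs), is to scan it for placeholder tokens that were never substituted. This is just a sketch, not a RAGFlow feature:

```python
import re

def unfilled_placeholders(prompt_text: str) -> list[str]:
    """Return placeholder names such as {style} still left in the prompt.

    If substitution worked, this list should be empty (or contain only
    literal braces that are part of the prompt's wording).
    """
    return sorted(set(re.findall(r"\{(\w+)\}", prompt_text)))

# Hypothetical captured prompt where {restriction} was never substituted:
captured = "The style of the generated answer: formal ... restrictions: {restriction}"
print(unfilled_placeholders(captured))  # → ['restriction']
```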
@EeckhoutJens Could you please state your RAGFlow version?
Apologies, version is v0.17.2
A light bulb icon will appear above the latest successful dialogue. Click it and you will see the exact prompt that was sent to your model. See the screenshot in https://ragflow.io/docs/dev/accelerate_question_answering. Hope this helps.
Thanks for the additional info.
Is there another way to access the output of clicking the lightbulb icon?
The problem I have is that the lightbulb icon is only present immediately after the model has generated its response. If I refresh the chat page or open the RAGFlow application after sending the prompt through the API, the lightbulb never shows up, even if it's the latest successful dialogue in that specific chat.
The light bulb shows only while chatting via the UI.
I have the same issue on v0.19.1. Is this issue resolved?