Flowise
[Question] Differences between Conversational Retrieval QA and Conversational Retrieval Agent
Describe the bug I have a chatflow based on Conversational Retrieval QA, OpenAI, and Pinecone as the vector store. The flow works smoothly as I expect. Later on, I wanted to add a custom tool to that flow, so I needed to convert it to a chatflow based on the Conversational Retrieval Agent (see screenshots below) and a Retriever Tool. When asking the same question, the results were conceptually totally different.
I want to emphasize that the OpenAI temperature and all other parameters were the same in both flows
To Reproduce Steps to reproduce the behavior:
- Build chatflow1
- Build chatflow2
- Upsert the same data on both chatflows
- Ask a question which triggers a retrieval from the vector store (Pinecone)
- Compare the results
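For reference, this is roughly how I compare the two flows (a minimal sketch; the Flowise URL, chatflow IDs, and question are placeholders, assuming the standard prediction endpoint):

```ts
// Minimal comparison script: send the same question to both chatflows via
// the Flowise prediction API and print the answers side by side.
// The base URL and the chatflow IDs below are placeholders.
const FLOWISE_URL = "http://localhost:3000";
const CHATFLOWS = {
  qaChain: "<chatflow-1-id>", // Conversational Retrieval QA
  agent: "<chatflow-2-id>",   // Conversational Retrieval Agent + Retriever Tool
};

async function ask(chatflowId: string, question: string): Promise<string> {
  const res = await fetch(`${FLOWISE_URL}/api/v1/prediction/${chatflowId}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  const data = await res.json();
  return data.text; // the prediction endpoint returns the answer in `text`
}

const question = "<a question answered by the upserted documents>";
console.log("QA chain:", await ask(CHATFLOWS.qaChain, question));
console.log("Agent   :", await ask(CHATFLOWS.agent, question));
```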
Expected behavior Both flows should return (conceptually) the same results.
Screenshots
chat flow 1 (good)
VS.
chat flow 2 (not good)
Let me know what I'm missing and how to fix it. At the end of the day, all I want is chatflow1 + a tool.
@HenryHengZJ please help
Thanks
I see two issues:
- Outdated Node: Your Flow 2 is using an outdated node.
- Different Default Messages: The Conversational Retrieval QA and the new Conversational Agent use very different default system messages.
Flow 1
Flow 2 (odd default message, considering this node can be used for Gemini, Mistral, and others)
-- However, I understand your point. The new Conversational Agent doesn't work as I'd like either.
Let's ask @HenryHengZJ whether it's a misconfiguration or just a limitation.
@toi500 thanks for your reply
- I was using an outdated Conversational Retrieval Agent node since I didn't see an updated version, although I'm using the latest version of Flowise. Let me know if I'm missing anything.
- Either way, I don't think that would solve the issue
- I was deep diving and I think I found the root cause. The difference is that with the Conversational Retrieval QA, the query to OpenAI contains the system message + the data retrieved from Pinecone, whereas with the Conversational Agent it doesn't contain the system message; that's why OpenAI returns an output very similar to its input (roughly the difference sketched below).
I believe that's a major bug in the Conversational Agent.
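To illustrate what I mean, here is a rough sketch of the two OpenAI requests as I understand them (simplified; the message contents are placeholders, not the actual payloads):

```ts
// Conversational Retrieval QA: the chunks retrieved from Pinecone are injected
// into the system/prompt message, so the model answers from that context.
const qaChainMessages = [
  {
    role: "system",
    content:
      "Use the following context to answer the question:\n\n<chunks retrieved from Pinecone>",
  },
  { role: "user", content: "<my question>" },
];

// Conversational Agent: the retriever tool output comes back as a tool message,
// but there is no system message framing that context, so the model tends to
// echo the retrieved text almost verbatim.
const agentMessages = [
  { role: "user", content: "<my question>" },
  // the agent calls the retriever tool and gets the raw chunks back...
  { role: "tool", name: "retriever_tool", content: "<chunks retrieved from Pinecone>" },
  // ...but nothing tells the model how to use them.
];
```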
@HenryHengZJ let me know if we're missing anything and whether there's any workaround to resolve this, or in general any other way we can use the Conversational Retrieval QA along with a custom tool.
Thanks
The new version of the Conversational Retrieval Agent is the new Conversational Agent, if I am not getting this wrong.
Did you try this? Instead of just attaching the Pinecone node through the Retriever Tool (like in Flow 2), use the Chain Tool to incorporate the entire chain.
--
[0001 Chatflow.json](https://github.com/FlowiseAI/Flowise/files/15289174/0001.Chatflow.json)
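For clarity, this is roughly what the Chain Tool approach means in code, assuming LangChain JS under the hood (the tool name, description, and the `vectorStore` variable are placeholders):

```ts
import { ChatOpenAI } from "@langchain/openai";
import { RetrievalQAChain } from "langchain/chains";
import { ChainTool } from "langchain/tools";

// `vectorStore` stands for the Pinecone store that already holds the upserted docs.
declare const vectorStore: import("@langchain/core/vectorstores").VectorStore;

const llm = new ChatOpenAI({ temperature: 0 });

// Build the same retrieval QA chain that Flow 1 uses...
const qaChain = RetrievalQAChain.fromLLM(llm, vectorStore.asRetriever());

// ...and expose the whole chain as a tool, instead of the bare retriever.
const qaTool = new ChainTool({
  name: "knowledge-base",
  description: "Answers questions using the documents stored in Pinecone.",
  chain: qaChain,
});
// `qaTool` is then what the agent should call, so retrieval and answering
// happen together with the proper prompt/context.
```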
@toi500
Thanks, I tried that as well. I think the Conversational Retrieval QA is not made for this purpose; I get the following error:
"chain.run is not a function"
any luck with Tool Agent?
> any luck with Tool Agent?
Tried it as well, same behavior.
It looks like the Chain Tool is bugged or outdated too.
https://github.com/FlowiseAI/Flowise/issues/2400#issue-2293095204
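If the `chain.run is not a function` error means the chain object no longer exposes `.run()` (it has been deprecated in LangChain JS in favor of `.call()`/`.invoke()`), one possible stopgap is a custom tool that calls the chain directly. The sketch below is only an illustration of that idea, not the actual node code, and the tool name/description are placeholders:

```ts
import { DynamicTool } from "langchain/tools";
import type { RetrievalQAChain } from "langchain/chains";

// Wrap an existing retrieval QA chain as a tool without relying on `chain.run`.
function chainAsTool(chain: RetrievalQAChain) {
  return new DynamicTool({
    name: "knowledge-base",
    description: "Answers questions using the documents stored in Pinecone.",
    func: async (input: string) => {
      // RetrievalQAChain expects { query } and returns the answer under `text`.
      const result = await chain.call({ query: input });
      return result.text as string;
    },
  });
}
```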
@toi500 @HenryHengZJ any news? It's blocking us
I don't believe it's possible to achieve the same level of comprehensive response from the Agent using the Retriever Tool, as you're attempting in Flow 2. What the Agent is doing there is executing a vector similarity search in Pinecone without the proper context.
The only way to obtain results similar to Flow 1 is to instruct the Agent to use the whole chain as a tool, and this, if I am not getting this wrong, can only be achieved via the Chain Tool, which is bugged at the moment.
How about adding a HyDE Retriever? It doesn't yield the same results, but you could apply a prompt to the documents returned.
Otherwise, very interesting detail shared here; I was also not happy with the Retriever Tool alone.
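In case it helps, this is roughly what the HyDE approach looks like with LangChain JS's HydeRetriever (a sketch only; the `vectorStore` variable, model choice, and `k` are placeholders):

```ts
import { ChatOpenAI } from "@langchain/openai";
import { HydeRetriever } from "langchain/retrievers/hyde";

declare const vectorStore: import("@langchain/core/vectorstores").VectorStore;

// HyDE: the LLM first drafts a hypothetical answer, and that text (rather than
// the raw question) is embedded and matched against Pinecone.
const retriever = new HydeRetriever({
  vectorStore,
  llm: new ChatOpenAI({ temperature: 0 }),
  k: 4, // number of documents to return
});

const docs = await retriever.getRelevantDocuments("<my question>");
// The returned documents can then be post-processed with your own prompt
// before generating the final answer.
```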
@niztal any news about your pull request? Can't wait to test it out.
I know the team is busy with the implementation of LangGraph (https://github.com/FlowiseAI/Flowise/pull/2319/commits/0718b1cc73f8af1b69b806cffb32fd5c78070156), which is amazing by the way, but I hope your new agent comes to light too.
@toi500 unfortunately no, I'm still waiting for @HenryHengZJ to review that.
Either way, I'm using it and it works perfectly!
Any news? I see the feature is committed and closed, but I can't find it in Flowise 1.8.2. Also, I see a lot of "OpenAI" in the new node name. Will I be able to use it with a local Ollama?