add single execution option for ConversationalRetrievalChain
- The current default workflow for ConversationalRetrievalChain is to rephrase the question based on the chat history and then use the rephrased question to perform retrieval and question answering.
- This two-step process has trade-offs: it helps keep the prompt within the context size limit, but it adds latency and can lose information from the original question.
- We can make the `question_generator` (chain) argument optional so that users can bypass the rephrasing step and supply a custom prompt that combines the `chat_history`, `context`, and `question` in a single execution (see the sketch below). This gives users more flexibility based on their use cases and requirements.
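Roughly, the intended usage would look like the following. This is only a sketch assuming this PR's change (an unmodified `ConversationalRetrievalChain` still requires `question_generator`); `vectorstore` and the prompt wording are placeholders, not part of the existing API.

```python
from langchain.chains import ConversationalRetrievalChain
from langchain.chains.combine_documents.stuff import StuffDocumentsChain
from langchain.chains.llm import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

llm = ChatOpenAI(temperature=0)

# A single prompt that folds chat_history, context, and question into one LLM call,
# so no separate question-rephrasing step is needed.
prompt = PromptTemplate(
    input_variables=["chat_history", "context", "question"],
    template=(
        "Chat history:\n{chat_history}\n\n"
        "Context:\n{context}\n\n"
        "Question: {question}\n"
        "Answer:"
    ),
)

combine_docs_chain = StuffDocumentsChain(
    llm_chain=LLMChain(llm=llm, prompt=prompt),
    document_variable_name="context",
)

# With the change proposed here, question_generator can be omitted and retrieval
# runs directly on the original question.
chain = ConversationalRetrievalChain(
    retriever=vectorstore.as_retriever(),  # assumes an existing vector store configured elsewhere
    combine_docs_chain=combine_docs_chain,
    question_generator=None,
)
```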
Fixes #5123
Before submitting
I updated the notebook to include a "ConversationalRetrievalChain with only a custom combine_docs_chain" section.
Who can review?
Community members can review the PR once tests pass. Tag maintainers/contributors who might be interested: @hwchase17 @dev2049
I believe this feature provides practical value. Can I get a review on this one? @hwchase17 @dev2049
Hello guys, we're having a similar issue with the ConversationalRetrievalQAChain class.
When adding a Pinecone vector store for memory, total response time can reach 45 seconds, which is far from ideal. This PR seems to have the potential to mitigate the issue; is it being actively looked at?
Thanks in advance.
i think we need just better documentation around conversational retrieval chains in general
I have updated the notebook to show how to set question_generator as None and use a custom prompt. But feel free to let me know anything else needed to get this PR merged. Thanks!
Hoping this can get merged soon since I need this patch for a work project! @hwchase17 @dev2049 any feedback?
👀👀👀
@dev2049 @hwchase17 can you take a look at this PR? Sorry for chasing.
How can I implement this solution in my project?
A much-needed feature. I am looking for this.
@jpzhangvincent Hi, could you please resolve the merge conflicts and address the last comments (if needed)? After that, ping me and I'll push this PR for review. Thanks!
@jpzhangvincent It's the merge conflicts again :( Could you please resolve them? After that, ping me and I'll push this PR for review. Thanks!
@leo-gan I believe I have fixed the merge conflicts. I tried to add a test or an example in the notebook (if needed), but I'm not sure exactly where / which file to add it to. Open to suggestions!
@jpzhangvincent There are failed unit tests and pydantic compatibility tests. When you fix them we will be ready.
@leo-gan I believe I have resolved the test issues. Please review again and feel free to make code changes directly!
@dev2049 @baskaryan can we get a merge for this? Sorry for jumping into the conversation, but I'm waiting to use this feature ASAP in prod code.
@baskaryan Could you please review it? It looks like it's urgent for users.
So any update on this PR? Thanks. - @leo-gan @baskaryan
slow for me as well. 30 seconds
When will this be fixed? This problem is very urgent. @leo-gan @baskaryan
just adding myself to the chorus of those begging for this PR to be merged @leo-gan @baskaryan
@baskaryan @leo-gan Okay, I have addressed the feedback and added a unit test. Hopefully we can get this merged soon, since it's low-hanging fruit and a useful feature.
@baskaryan @leo-gan @hwchase17 thoughts on merging this PR?
Can you please review again? Thanks! @baskaryan @leo-gan
Thank you for this. Closing because this is readily achievable with `create_retrieval_chain`. See the example snippet in the API reference. You can also reference this guide on migrating from `ConversationalRetrievalChain` to LCEL implementations. Please let me know if you have any other concerns.
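For anyone landing here later, a minimal sketch of that single-call pattern with `create_retrieval_chain`, assuming a `retriever` is already configured elsewhere (that name is a placeholder, not something defined in this thread):

```python
from langchain.chains import create_retrieval_chain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(temperature=0)

# One prompt that receives the retrieved context, the chat history, and the raw input.
prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer the question using the retrieved context:\n\n{context}"),
    MessagesPlaceholder("chat_history"),
    ("human", "{input}"),
])

# No history-aware retriever / rephrasing step: the raw input goes straight to the
# retriever, and the chat history is folded into the final answering prompt.
combine_docs_chain = create_stuff_documents_chain(llm, prompt)
chain = create_retrieval_chain(retriever, combine_docs_chain)

result = chain.invoke({"input": "What did we decide last time?", "chat_history": []})
print(result["answer"])
```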