langchain
Can I use vectorstore with LLMChain?
Hi!
Trying to build a chat with OpenAI ChatGPT that can make use of info from my own documents. If I use LLMChain, the chat behaves exactly like the OpenAI web interface and I get the same high-quality answers. However, there seems to be no way of using LLMChain with vectorstores so that it can include my documents.
If I try to use ConversationalRetrievalChain instead, I can use vectorstores and retrieve info from my docs, but the chat quality is bad: it ignores my prompts, for example when I ask it to impersonate a historical figure (after just a few questions it starts saying that it is an AI model and can't impersonate anyone).
Is there a way I can have a chat that behaves exactly like chat.openai.com but can also make use of local documents?
from langchain.chains import LLMChain
from langchain.memory import VectorStoreRetrieverMemory

retriever = vectorstore.as_retriever(search_kwargs=dict(k=1))
memory = VectorStoreRetrieverMemory(retriever=retriever)
chain = LLMChain(llm=llm, prompt=prompt, verbose=True, memory=memory)
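For what it's worth, a minimal sketch of calling the resulting chain, assuming the prompt has an input variable named input and uses the memory's default history key (adjust the names to your own template):

response = chain.predict(input="Who are you impersonating today?")
print(response)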
Thank you! That solved it!
Hi, I have a requirement where I need to use MultiPromptChain to select one of two prompts on the basis of a given query, and after selecting that prompt I need to extract the relevant documents for the query from the vector database. The query is to compare two reports on similar categories. With the help of MultiPromptChain I am able to select a particular prompt, but I am stuck on how to extract the relevant documents for that query and use them for the comparison.
I'd need something similar to that. Any ideas on how to use a vectorstore-backed LLMChain as part of a MultiPromptChain?
I am looking for the same thing but haven't found a solution yet.
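One possible direction for the MultiPromptChain + retrieval question above (a rough sketch, not a confirmed solution; comparison_prompt is a made-up name, and llm / vectorstore stand in for your own objects): once the router has picked the prompt, fetch the relevant documents yourself and pass them in as an ordinary prompt variable.

from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# Hypothetical prompt for the "comparison" route; adapt to your own template.
comparison_prompt = PromptTemplate(
    input_variables=["context", "question"],
    template="Compare the reports using only the context below.\n\n{context}\n\nQuestion: {question}\nAnswer:",
)

retriever = vectorstore.as_retriever(search_kwargs=dict(k=4))

def answer_with_docs(question: str) -> str:
    # Pull the documents relevant to the query from the vector store
    docs = retriever.get_relevant_documents(question)
    context = "\n\n".join(doc.page_content for doc in docs)
    chain = LLMChain(llm=llm, prompt=comparison_prompt)
    return chain.predict(context=context, question=question)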
How do you use the VectorStoreRetrieverMemory solution above to get the retriever to fill a placeholder in the prompt?
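In case it helps, a minimal sketch of what that prompt might look like. It assumes the memory keeps its default history key and the chain is called with an input variable; both are just example names, not requirements.

from langchain.prompts import PromptTemplate

# VectorStoreRetrieverMemory injects the retrieved documents under its
# memory_key (default "history"), so the prompt needs a matching placeholder.
template = """Use the following context from my documents when it is relevant:

{history}

Human: {input}
AI:"""
prompt = PromptTemplate(input_variables=["history", "input"], template=template)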