Trying to pass custom prompt in load_qa_with_sources_chain results in error
Running the code below produces the following error: `document_variable_name summaries was not found in llm_chain input_variables: ['name'] (type=value_error)`
Any ideas?
Code:
```python
def use_prompt(self, template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    self.chain = load_qa_with_sources_chain(
        llm=self.llm,
        prompt=prompt_template,
        verbose=verbose,
    )

use_prompt(template="Only answer the question 'What is my name?' by replying with only the name. My name is {name}", variables=["name"])
```
That should work
It seems to be expecting a {summaries} variable. Try this; it should work:
```python
def use_prompt(template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    chain = load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )

use_prompt(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
=========
{summaries}
=========
Final Answer:""", variables=["summaries", "name"])
```
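The reason the {summaries} slot matters: with the default "stuff" chain type, the chain joins the retrieved documents and substitutes the result into that variable before calling the LLM. A minimal pure-Python sketch of that substitution (mimicking the behavior, not the actual LangChain internals):

```python
# Pure-Python sketch of what the "stuff" combine-documents step does with the
# prompt; the real chain uses PromptTemplate, but the substitution is the same.
def stuff_documents(template: str, docs: list, **other_vars) -> str:
    # The chain joins the documents' text and injects it as {summaries};
    # every other declared variable must be supplied by the caller.
    summaries = "\n".join(docs)
    return template.format(summaries=summaries, **other_vars)

template = (
    "Only answer the question 'What is my name?' by replying with only the name. "
    "My name is {name}\n=========\n{summaries}\n=========\nFinal Answer:"
)
print(stuff_documents(template, ["Source A", "Source B"], name="Alice"))
```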
I may really be doing something wrong. With the code below I get the error: `ValueError: Missing some input keys: {'name'}`. I pass exactly what you provided, right? Am I missing something? Could this be a bug?
```python
from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    llm = ChatOpenAI()
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )

chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
=========
{summaries}
=========
Final Answer:""", variables=["summaries", "name"])

question = "test question?"
answer = chain.run(input_documents="", question=question)
```
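If it helps: the `Missing some input keys: {'name'}` error looks like the chain's input validation. Every prompt variable except summaries (which the chain fills from input_documents) becomes a required chain input, and the run() call here only supplies input_documents and question. A sketch of that check (an assumption mirroring the error message, not the actual LangChain code); passing name along with the other inputs, e.g. `chain({"input_documents": docs, "question": question, "name": "Alice"})`, should satisfy it:

```python
# Sketch of the chain's input-key validation (assumption, mirroring the
# reported error message; not the actual LangChain implementation).
def validate_inputs(declared_prompt_vars, provided_keys):
    # "summaries" is filled by the chain from input_documents, so it is not
    # required from the caller; everything else is.
    required = set(declared_prompt_vars) - {"summaries"}
    missing = required - set(provided_keys)
    if missing:
        raise ValueError(f"Missing some input keys: {missing}")

try:
    validate_inputs(["summaries", "name", "question"],
                    ["input_documents", "question"])
except ValueError as e:
    print(e)  # Missing some input keys: {'name'}
```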
I am a bit confused about how you're defining "name" in the prompt. Actually, I don't exactly understand the prompt, since you don't take the name as input anywhere. What do you want to do, exactly?
This works for me:
```python
from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    llm = ChatOpenAI(engine=deployment_name)
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )

chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is name
=========
{summaries}
=========
Final Answer:""", variables=["summaries"])

question = "test question?"
answer = chain.run(input_documents="", question=question)
print(answer)
```
How would the snippet you shared work if you're not passing the summaries variable when you call run?
Another question about this: why is {summaries} required at all? Shouldn't I be able to create a template that takes only the inputs I want it to take?
I have the same question... why is {summaries} even required?
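As far as I can tell, {summaries} is required because the "stuff" variant of load_qa_with_sources_chain defaults its document_variable_name to "summaries" and validates the prompt against it; the loader appears to accept a document_variable_name keyword to rename that slot, though that's worth verifying against your LangChain version. A stand-in sketch of that validation (not the real loader):

```python
# Stand-in sketch (not the real loader): the "stuff" chain checks that the
# prompt declares the document variable, which defaults to "summaries".
def load_stuff_chain(prompt_variables, document_variable_name="summaries"):
    if document_variable_name not in prompt_variables:
        raise ValueError(
            f"document_variable_name {document_variable_name} was not found "
            f"in llm_chain input_variables: {prompt_variables}"
        )
    return {"document_variable_name": document_variable_name}

# A prompt declaring only ['name'] reproduces the original error:
try:
    load_stuff_chain(["name"])
except ValueError as e:
    print(e)
# Renaming the slot makes a "summaries"-free prompt valid:
load_stuff_chain(["context", "question"], document_variable_name="context")
```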
Is there an example of making it work with RetrievalQA?
The code examples above do not work with RetrievalQA
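For RetrievalQA, the document slot in the default "stuff" prompt is named {context} rather than {summaries}, which matches the "create a variable called context" advice mentioned in this thread; a custom prompt is typically supplied via `chain_type_kwargs={"prompt": ...}` in `RetrievalQA.from_chain_type`, though that's worth checking against your version. A pure-Python sketch of the template shape it fills in:

```python
# Sketch of the prompt shape RetrievalQA's default "stuff" chain fills in:
# retrieved documents go into {context}, the user query into {question}.
template = (
    "Use the following pieces of context to answer the question.\n"
    "{context}\n"
    "Question: {question}\n"
    "Helpful Answer:"
)
docs = ["LangChain is a framework for LLM apps.", "It wires prompts to models."]
print(template.format(context="\n".join(docs), question="What is LangChain?"))
```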
I have the same issue... another Google search pointed out that I need to create a variable called "context", but I don't need it.
Have there been any resolutions here? The issue still persists: I get the same error when purposely passing a custom prompt with no variables in the f-string, following the documentation directly from LangChain: https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/
Maybe this helps:
```python
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)
from langchain.chains import ConversationalRetrievalChain

messages = [
    SystemMessagePromptTemplate.from_template(promptTemplate),
    HumanMessagePromptTemplate.from_template("{question}"),
]
qa_prompt = ChatPromptTemplate.from_messages(messages)
qa_chain = ConversationalRetrievalChain.from_llm(
    llm,
    retriever,
    memory=memory,
    get_chat_history=lambda h: h,
    combine_docs_chain_kwargs={"prompt": qa_prompt},
)
```
> I have the same issue... other google search pointed out that i need to create a variable called "context" but i dont need it..

Did you ever solve it? What did you do?