
Trying to pass custom prompt in load_qa_with_sources_chain results in error

Open · momegas opened this issue 1 year ago · 6 comments

Running the code below produces the following error: document_variable_name summaries was not found in llm_chain input_variables: ['name'] (type=value_error)

Any ideas?

Code:

def use_prompt(self, template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )

    self.chain = load_qa_with_sources_chain(
        llm=self.llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
use_prompt(template="Only answer the question 'What is my name?' by replying with only the name. My name is {name}", variables=["name"])

momegas commented on Apr 13 '23

It seems to be expecting a {summaries} variable; try this, it should work:

def use_prompt(template: str, variables: List[str], verbose: bool = False):
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    # llm is assumed to be defined elsewhere, e.g. ChatOpenAI()
    chain = load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
use_prompt(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries", "name"])

skeretna commented on Apr 17 '23

I may really be doing something wrong. I get the error ValueError: Missing some input keys: {'name'} with the code below. I pass exactly what you provided, right? Am I missing something? Could this be a bug?

from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    llm = ChatOpenAI()
    
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is {name}
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries", "name"])

question = "test question?"
answer = chain.run(input_documents="", question=question)

momegas commented on Apr 17 '23
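For what it's worth, the ValueError: Missing some input keys: {'name'} is raised because every input_variable declared in the prompt has to be supplied when the chain is invoked. A minimal sketch, with a hypothetical hard-coded value for name:

# Prompt variables beyond "summaries" must be passed at call time;
# "Bob" is a hypothetical example value for {name}, and an empty list
# of input_documents yields an empty {summaries} block.
answer = chain.run(input_documents=[], question=question, name="Bob")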

I am a bit confused about how you're defining "name" in the prompt. Actually, I don't quite understand the prompt, since you don't take the name as input anywhere. What do you want to do exactly?

This works for me:

from langchain import PromptTemplate
from langchain.chains.qa_with_sources import load_qa_with_sources_chain
from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv

load_dotenv()

def get_chain(template: str, variables, verbose: bool = False):
    # deployment_name refers to an Azure OpenAI deployment defined elsewhere
    llm = ChatOpenAI(engine=deployment_name)
    
    prompt_template = PromptTemplate(
        template=template,
        input_variables=variables,
    )
    return load_qa_with_sources_chain(
        llm=llm,
        prompt=prompt_template,
        verbose=verbose,
    )
        
chain = get_chain(template="""Only answer the question 'What is my name?' by replying with only the name. My name is name
    =========
    {summaries}
    =========
    Final Answer:""", variables=["summaries"])

question = "test question?"
answer = chain.run(input_documents="", question=question)

print(answer)

skeretna commented on Apr 19 '23

How would the code you shared work if you're not passing the summaries variable when you call run?

amirgamil commented on May 22 '23
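Best guess: {summaries} is never passed by the caller at all. The combine-documents chain builds it from input_documents, formatting each document (with its source) and substituting the joined result for {summaries} in the prompt, so run() never takes a summaries argument directly. A sketch with a hypothetical document:

from langchain.docstore.document import Document

# {summaries} is filled in by the chain itself from input_documents,
# which is why the working example above passes an empty value for it.
docs = [Document(page_content="My name is Bob.", metadata={"source": "notes.txt"})]
answer = chain.run(input_documents=docs, question="What is my name?")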

Another question about this. Why is {summaries} required at all? Shouldn't I be able to create a template that takes only the inputs I want it to take?

wmbutler commented on Jun 02 '23

I have the same question... why is {summaries} even required?

vibha0411 commented on Jul 07 '23
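As far as I can tell, {summaries} is required because load_qa_with_sources_chain with the default chain_type="stuff" builds a StuffDocumentsChain whose document_variable_name is "summaries": the documents are formatted, concatenated, and substituted into the prompt under that name, so a custom prompt must declare it. A rough sketch, assuming the llm and prompt_template from the snippets above:

# The "stuff" chain injects the joined documents into the prompt under
# document_variable_name, which defaults to "summaries" for this loader.
chain = load_qa_with_sources_chain(llm, chain_type="stuff", prompt=prompt_template)
print(chain.document_variable_name)  # -> "summaries"

# The name can seemingly be overridden, but the prompt must then declare
# the new variable; context_prompt is a hypothetical template using {context}.
chain = load_qa_with_sources_chain(
    llm, chain_type="stuff", prompt=context_prompt, document_variable_name="context"
)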

Is there an example of making it work with RetrievalQA?

The code examples above do not work with RetrievalQA

aiquick commented on Jul 19 '23
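A sketch of a RetrievalQA equivalent, assuming llm and retriever are already set up. Note that RetrievalQA's "stuff" chain uses {context} as its document variable rather than {summaries}, which is presumably the "context" requirement mentioned below:

from langchain import PromptTemplate
from langchain.chains import RetrievalQA

# RetrievalQA's stuff chain expects {context} and {question} in the prompt.
qa_prompt = PromptTemplate(
    template="Answer using only the context below.\n\n{context}\n\nQuestion: {question}\nAnswer:",
    input_variables=["context", "question"],
)
qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=retriever,
    chain_type_kwargs={"prompt": qa_prompt},
)
print(qa.run("What is my name?"))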

I have the same issue... other Google searches pointed out that I need to create a variable called "context", but I don't need it.

stepkurniawan commented on Sep 13 '23

Have there been any resolutions here? This issue still persists. I am getting the same error when I pass a custom prompt that purposely has no variables associated with the f-string. I am following the documentation directly from LangChain: https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/

Stephen-Strosko commented on Oct 18 '23

Maybe this helps:

from langchain.chains import ConversationalRetrievalChain
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate, SystemMessagePromptTemplate

# llm, retriever, memory, and promptTemplate are assumed to be defined elsewhere
messages = [
    SystemMessagePromptTemplate.from_template(promptTemplate),
    HumanMessagePromptTemplate.from_template("{question}"),
]
qa_prompt = ChatPromptTemplate.from_messages(messages)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm, retriever, memory=memory, get_chat_history=lambda h: h,
    combine_docs_chain_kwargs={"prompt": qa_prompt},
)

RubensZimbres commented on Dec 29 '23
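For completeness, a hypothetical invocation of the chain above; with memory attached, the chat history is filled in automatically and the response comes back under the "answer" key:

# The memory object supplies chat_history, so only the question is passed.
result = qa_chain({"question": "What is my name?"})
print(result["answer"])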

> I have the same issue... other Google searches pointed out that I need to create a variable called "context", but I don't need it.

Did you ever solve it? What did you do?

bga41 commented on Mar 04 '24