
DOC: SQL Chain Example - Customise Prompt

Open Data-drone opened this issue 1 year ago • 15 comments

Issue with current documentation:

https://python.langchain.com/en/latest/modules/chains/examples/sqlite.html#customize-prompt

When looking at the Customize Prompt example, the subsequent db_chain.run() call is the same as in the pre-prompt chain.

It is currently like:

db_chain.run("How many employees are there in the foobar table?")

Shouldn't it be something like:

db_chain.run({'input': "How many employees are there in the foobar table?", 'table':'foobar', 'dialect':'testing'})

since we added the custom prompt to the db_chain?

Idea or request for content:

No response

Data-drone avatar May 15 '23 02:05 Data-drone

SQLDatabaseChain figures those parameters out for you; that's why they are not expected to be provided. It also does a bunch of other things, like telling the LLM where to stop via the stop argument.

If you want to provide these arguments to the LLM directly via the prompt, you would do something like this:

from langchain.prompts.prompt import PromptTemplate
from langchain.chains import LLMChain
from langchain.sql_database import SQLDatabase
from langchain.llms import OpenAI

openai_llm = OpenAI(temperature=0)  # any LLM instance works here

_DEFAULT_TEMPLATE = """Given an input question, first create a syntactically correct {dialect} query to run, then look at the results of the query and return the answer.
Use the following format:

Question: "Question here"
SQLQuery: "SQL Query to run"
SQLResult: "Result of the SQLQuery"
Answer: "Final answer here"

Only use the following tables:

{table_info}

If someone asks for the table foobar, they really mean the employee table.

Question: {input}"""

PROMPT = PromptTemplate(
    input_variables=["input", "table_info", "dialect"], template=_DEFAULT_TEMPLATE
)

db = SQLDatabase.from_uri("sqlite:///movie.db")
chain = LLMChain(llm=openai_llm, prompt=PROMPT)
chain.run(dict(input="Who is the oldest employee?", table_info=db.get_table_info(), dialect=db.dialect))
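
Conversely, if you want SQLDatabaseChain itself to use the customised prompt (the scenario in the docs example), a minimal sketch reusing PROMPT, db, and openai_llm from above would look like this; the chain fills in table_info and dialect on its own, so run() still takes only the question:

from langchain.chains import SQLDatabaseChain  # moved to langchain_experimental.sql in newer releases

# The chain supplies `input`, `table_info`, and `dialect` to the prompt automatically.
db_chain = SQLDatabaseChain.from_llm(llm=openai_llm, db=db, prompt=PROMPT, verbose=True)
db_chain.run("How many employees are there in the foobar table?")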

imeckr avatar May 15 '23 11:05 imeckr

Can I add additional input variables besides those in the _DEFAULT_TEMPLATE? I want to add chat history to the template, but I keep getting ValueError: missing input keys: {'chat_history'}.

pressembrero avatar May 22 '23 09:05 pressembrero

You can add variables, but you need to provide them alongside the other inputs to the chain. You get the missing input keys error if you don't provide the variables defined in the template.

In case you are still facing the issue, please share the code snippet.

imeckr avatar May 22 '23 09:05 imeckr

I am trying to give additional context to the prompt template by adding chat history from Discord, but I am getting ValueError: missing input keys: {'chat_history'} even though I already declared chat_history.

This is how I want to create my prompt:

_DEFAULT_TEMPLATE = """
        #Instructions
        You are a chatbot on Discord. 
        Respond to the following conversation in the most helpful way possible.
        Given an input question, first create a syntactically correct {dialect} query to run, then look at the results of the query and return the answer.
        Use the following format:

            Chat History: '{chat_history}'
            Question: "Question here"
            SQLQuery: "SQL Query to run"
            SQLResult: "Result of the SQLQuery" 
            Answer: "Final answer here"

        Only use the following tables:

        {table_info}
        
        Use the table employee with id as primary key to get the list of employees.
        Use the table attendance with id as the primary key to get the attendance.
        Use the table position with id as the primary key to get the position. 

        
        Question: {input}
        """
        
PROMPT = PromptTemplate(
    input_variables=['dialect', 'input', 'table_info', 'chat_history'],
    template=_DEFAULT_TEMPLATE,
    validate_template=False,
)

db_chain = SQLDatabaseChain.from_llm(
    llm=llm, db=sql_database, prompt=PROMPT, verbose=True, top_k=10
)

pressembrero avatar May 22 '23 10:05 pressembrero

Are you passing the chat_history variable in chain.run?

imeckr avatar May 22 '23 11:05 imeckr

Are you passing the chat_history variable in chain.run?

Yep:

db_chain = SQLDatabaseChain.from_llm(llm=llm, db=sql_database, prompt=PROMPT, verbose=True, top_k=10)
response = db_chain.run(chat_history)

pressembrero avatar May 22 '23 12:05 pressembrero

I also tried:

db_chain = SQLDatabaseChain.from_llm(llm=llm, db=sql_database, prompt=PROMPT, verbose=True, top_k=10)
response = db_chain(message.content, chat_history)

pressembrero avatar May 22 '23 12:05 pressembrero

How can I implement ConversationBufferMemory with the LLMChain example above?

charanhu avatar May 26 '23 13:05 charanhu

This example shows how to add ConversationBufferMemory to an LLMChain: https://python.langchain.com/en/latest/modules/memory/types/buffer.html
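
Roughly, following that page, you attach a ConversationBufferMemory whose memory_key matches the {chat_history} variable in your prompt. A minimal sketch using the plain LLMChain approach from the earlier comment rather than SQLDatabaseChain (openai_llm and db are assumed from the snippet earlier in this thread, and the template here is only a hypothetical example with a {chat_history} slot):

from langchain.chains import LLMChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts.prompt import PromptTemplate

template = """Previous conversation:
{chat_history}

Given an input question, write a syntactically correct {dialect} query, run it, and return the answer.
Only use the following tables:
{table_info}

Question: {input}"""

prompt = PromptTemplate(
    input_variables=["chat_history", "dialect", "table_info", "input"], template=template
)

# memory_key must match the prompt variable; input_key tells the memory which input is the user message.
memory = ConversationBufferMemory(memory_key="chat_history", input_key="input")
chain = LLMChain(llm=openai_llm, prompt=prompt, memory=memory)

chain.run(input="Who is the oldest employee?", table_info=db.get_table_info(), dialect=db.dialect)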

imeckr avatar May 26 '23 18:05 imeckr

Has this issue been fixed? I am also facing a similar issue.

AmlanSamanta avatar May 30 '23 07:05 AmlanSamanta

I am also blocked by this issue.

MatejBabis avatar Jun 22 '23 12:06 MatejBabis

I am also unable to retain chat history with SQLDatabaseChain. I get a missing-parameter error whenever I try to run the chain with two parameters (query, chat_history) instead of one.

sanasz91mdev avatar Jun 27 '23 05:06 sanasz91mdev

I have a similar question on how to implement chat history with SQLDatabaseChain.from_llm. Thanks!

Soul-AI-LLM avatar Jul 09 '23 09:07 Soul-AI-LLM

Is there any updated information?

dtthanh1971 avatar Jul 10 '23 04:07 dtthanh1971

I noticed that in the SQL database chain's base.py there is no history defined. Could we add history support to SQLDatabaseChain?

def _call(
    self,
    inputs: Dict[str, Any],
    run_manager: Optional[CallbackManagerForChainRun] = None,
) -> Dict[str, Any]:
    _run_manager = run_manager or CallbackManagerForChainRun.get_noop_manager()
    input_text = f"{inputs[self.input_key]}\nSQLQuery:"
    _run_manager.on_text(input_text, verbose=self.verbose)
    # If not present, then defaults to None which is all tables.
    table_names_to_use = inputs.get("table_names_to_use")
    table_info = self.database.get_table_info(table_names=table_names_to_use)
    llm_inputs = {
        "input": input_text,
        "top_k": str(self.top_k),
        "dialect": self.database.dialect,
        "table_info": table_info,
        "stop": ["\nSQLResult:"],
    }
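
Until history is supported there, one workaround (just a sketch, not an official SQLDatabaseChain feature) is to bind the chat history into the prompt yourself before each call, e.g. with PromptTemplate.partial(), so that the fixed llm_inputs built by _call are enough. Here llm, sql_database, and the chat-history _DEFAULT_TEMPLATE are assumed from the earlier comments:

from langchain.chains import SQLDatabaseChain  # langchain_experimental.sql in newer releases
from langchain.prompts.prompt import PromptTemplate

base_prompt = PromptTemplate(
    input_variables=["chat_history", "dialect", "table_info", "input"],
    template=_DEFAULT_TEMPLATE,
)

def run_with_history(question: str, chat_history: str) -> str:
    # Pre-fill chat_history so the chain only needs to supply input, dialect, and table_info.
    prompt = base_prompt.partial(chat_history=chat_history)
    db_chain = SQLDatabaseChain.from_llm(
        llm=llm, db=sql_database, prompt=prompt, verbose=True, top_k=10
    )
    return db_chain.run(question)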

Soul-AI-LLM avatar Jul 10 '23 07:07 Soul-AI-LLM

How can we implement this in MultiPromptChain?

Boopalanoptisol avatar Jul 13 '23 07:07 Boopalanoptisol

Hi, @Data-drone! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue you raised is about a documentation error in the SQL Chain Example. You suggested that the db_chain.run() command should include additional parameters such as 'table' and 'dialect' since the prompt was added to the db_chain. There has been a discussion in the comments about adding additional input variables to the template and implementing conversation buffer memory in the chain. Some users have also reported issues with retaining chat history in the SQLDatabaseChain.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your contribution to the LangChain project!

dosubot[bot] avatar Oct 12 '23 16:10 dosubot[bot]