langchain
ChatOpenAI (gpt-3.5-turbo) isn't compatible with create_pandas_dataframe_agent, create_csv_agent, etc.
I tried creating a pandas DataFrame agent (using create_pandas_dataframe_agent) with ChatOpenAI (gpt-3.5-turbo) as the LLM, but LangChain isn't able to parse the LLM's output code. Of course, when I use a davinci model it works.
This is the code:

```python
from langchain.llms import OpenAIChat
from langchain.agents import create_csv_agent

openaichat = OpenAIChat(model_name="gpt-3.5-turbo")
agent = create_csv_agent(openaichat, 'fishfry-locations.csv', verbose=True)
x = agent.run("How many rows for church?")
```
This is the output and error:

```
> Entering new AgentExecutor chain...
Thought: We need to filter the dataframe to only include rows where the venue_type is "Church" and then count the number of rows.
Action: python_repl_ast
Action Input:
len(df[df['venue_type'] == 'Church'])
Observation: invalid syntax (
len(df[df['venue_type'] == 'Church'])
Observation: invalid syntax (
> Finished chain.
```
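One plausible cause (an assumption, not confirmed in this thread): chat models like gpt-3.5-turbo often wrap generated code in markdown fences, and if those fences reach the Python REPL tool verbatim, the interpreter raises a SyntaxError even though the enclosed code is fine. A minimal sanitizer sketch, assuming the fix is to strip fences before execution:

```python
def strip_markdown_fences(text: str) -> str:
    """Remove a surrounding ```-fence (with optional language tag) if present."""
    text = text.strip()
    if text.startswith("```"):
        lines = text.splitlines()
        # Drop the opening fence line, and the closing fence line if there is one.
        if lines and lines[-1].strip() == "```":
            lines = lines[1:-1]
        else:
            lines = lines[1:]
        return "\n".join(lines)
    return text

# Example: the fenced output a chat model might emit becomes plain code.
raw = "```python\nlen(df[df['venue_type'] == 'Church'])\n```"
print(strip_markdown_fences(raw))  # len(df[df['venue_type'] == 'Church'])
```

Newer LangChain releases handle this in their output parsers; the sketch above only illustrates the idea for debugging.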
I have seen lots of parsing errors with gpt-3.5-turbo and gpt-4 as well.
Ditto with the pandas agent. Often the syntax is actually correct.
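The claim that the syntax is often correct is easy to verify: the exact expression the agent generated parses as valid Python on its own, while the same expression wrapped in markdown fences (which chat models tend to emit) does not. A small stdlib-only check:

```python
import ast

snippet = "len(df[df['venue_type'] == 'Church'])"

# The bare expression parses cleanly, so it is syntactically valid Python.
tree = ast.parse(snippet, mode="eval")
print(type(tree).__name__)  # Expression

# Wrapped in markdown fences (as a chat model might emit it), parsing fails,
# which would explain an "invalid syntax" observation despite correct code.
fenced = "```python\n" + snippet + "\n```"
try:
    ast.parse(fenced)
    fenced_is_valid = True
except SyntaxError:
    fenced_is_valid = False
print(fenced_is_valid)  # False
```

This suggests the failure lies in how the model's output is post-processed before hitting the REPL, not in the generated pandas code itself.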
Hi, @Architectshwet! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you reported an issue regarding the compatibility of the ChatOpenAI (gpt-3.5-turbo) model with certain agent creation functions like create_pandas_dataframe_agent and create_csv_agent. It seems that LangChain is unable to parse the output code from the LLM. engma-linguistics and tevslin have also encountered parsing errors with gpt-3.5-turbo and gpt-4, and tevslin mentioned that the syntax is often correct.
Before we proceed, we would like to confirm if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and contribution to the LangChain project. We appreciate your support!