
Issue with using LlamaCpp LLM in Pandas Dataframe Agent

Open serena-mafinancial opened this issue 1 year ago • 1 comments

I am trying to use the Pandas Agent create_pandas_dataframe_agent, but instead of using OpenAI I am replacing the LLM with LlamaCpp. I am running this in Python 3.9 on a SageMaker notebook with an ml.g4dn.xlarge instance. I am having trouble running this agent; it produces a strange error.

The code is as follows: [code screenshot not available]

This is the error log: [error screenshots not available]

Detailed error log below: [traceback screenshots not available]

serena-mafinancial avatar Apr 26 '23 02:04 serena-mafinancial

Update: I added n_ctx=1024 in LlamaCpp to load the Vicuna model and got a new error. [screenshot not available]

Error: [error screenshots not available]

serena-mafinancial avatar Apr 26 '23 06:04 serena-mafinancial

Have you got any solutions?

Aditya701 avatar Jul 16 '23 17:07 Aditya701

I'm facing the same issue here

Ralphchahwan avatar Jul 21 '23 23:07 Ralphchahwan

Just a general observation: the pandas agent doesn't work well with models other than ChatGPT. For your "could not parse LLM output" error, try passing in this output parser:

from langchain.agents import create_pandas_dataframe_agent
from langchain.agents.mrkl.output_parser import MRKLOutputParser

agent = create_pandas_dataframe_agent(
    llm,
    df,
    verbose=True,
    output_parser=MRKLOutputParser(),
)

This should help parse the output before it is fed back to the agent. That said, I've noticed that the outputs generated by models other than ChatGPT still don't work particularly well.

Hope this helps
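To see why the parser matters: a MRKL-style agent expects the model's completion to contain an `Action:` / `Action Input:` pair and raises "Could not parse LLM output" otherwise, which local models frequently trigger by drifting from the format. A minimal illustrative sketch of that expectation (my own simplified regex, not LangChain's actual parser):

```python
import re

# The agent loop expects completions shaped like:
#   Thought: ...
#   Action: <tool name>
#   Action Input: <tool argument>
# Models that answer in free prose never match, so parsing fails.
ACTION_RE = re.compile(r"Action:\s*(.+?)\nAction Input:\s*(.+)", re.DOTALL)

def parse_mrkl(text: str):
    """Return (action, action_input) or raise, mimicking the agent's parse step."""
    m = ACTION_RE.search(text)
    if m is None:
        raise ValueError(f"Could not parse LLM output: {text!r}")
    return m.group(1).strip(), m.group(2).strip()

print(parse_mrkl("Thought: check shape\nAction: python_repl_ast\nAction Input: df.shape"))
```

A well-formatted completion parses cleanly; a conversational one ("I think the shape is (10, 3).") raises, which is exactly the failure mode reported above.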

kvnsng avatar Jul 28 '23 14:07 kvnsng

Entering new AgentExecutor chain...

ValueError Traceback (most recent call last)
<cell line: 2>()
      1 # results = agent("how many records are there? Give me rows and columns")
----> 2 agent1.run("what is the shape of dataframe")

18 frames
/usr/local/lib/python3.10/dist-packages/langchain/llms/llamacpp.py in _get_parameters(self, stop)
    249 # Raise error if stop sequences are in both input and default params
    250 if self.stop and stop is not None:
--> 251     raise ValueError("stop found in both the input and default params.")
    252
    253 params = self._default_params

ValueError: stop found in both the input and default params.
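The traceback shows where this comes from: LlamaCpp refuses a call-time stop list when one is already set on the instance, and the agent always supplies its own stop sequence at call time. So the fix is to leave `stop` unset when constructing LlamaCpp. A small sketch mirroring that guard (my own replica of the logic, not the library code itself):

```python
# Replica (assumption: simplified from langchain's LlamaCpp._get_parameters):
# exactly one source of stop sequences is allowed per call.
def get_parameters(default_stop, call_stop):
    """Pick the stop list, rejecting the ambiguous both-set case."""
    if default_stop and call_stop is not None:
        raise ValueError("stop found in both the input and default params.")
    return call_stop if call_stop is not None else (default_stop or [])

# The agent passes its own stop sequence at call time, so the LLM-level
# default must stay empty for agent use:
print(get_parameters(default_stop=None, call_stop=["\nObservation:"]))
```

In practice that means constructing the model without a `stop=` argument (e.g. `LlamaCpp(model_path=..., n_ctx=...)`) when it will be driven by an agent.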

deepthi97midasala avatar Aug 31 '23 15:08 deepthi97midasala

I get this error; I am using a pandas dataframe.

deepthi97midasala avatar Aug 31 '23 15:08 deepthi97midasala

Entering new AgentExecutor chain...


ValueError Traceback (most recent call last)
<cell line: 1>()
----> 1 agent.run("What are the applications where high volume of tickets are reported in the last 6 months?")

18 frames
/usr/local/lib/python3.10/dist-packages/llama_cpp/llama.py in _create_completion(self, prompt, suffix, max_tokens, temperature, top_p, logprobs, echo, stop, frequency_penalty, presence_penalty, repeat_penalty, top_k, stream, tfs_z, mirostat_mode, mirostat_tau, mirostat_eta, model, stopping_criteria, logits_processor, grammar)
    898
    899 if len(prompt_tokens) >= llama_cpp.llama_n_ctx(self.ctx):
--> 900     raise ValueError(
    901         f"Requested tokens ({len(prompt_tokens)}) exceed context window of {llama_cpp.llama_n_ctx(self.ctx)}"
    902     )

ValueError: Requested tokens (9064) exceed context window of 4096

deepthi97midasala avatar Sep 05 '23 09:09 deepthi97midasala

I'm getting this with a dataframe of 2047 records. My prompt is agent.run(" What are the applications where high volume of tickets are reported in the last 6 months?"). Please help.

deepthi97midasala avatar Sep 05 '23 09:09 deepthi97midasala

Hi, @serena-mafinancial! I'm helping the LangChain team manage their backlog and am marking this issue as stale.

It looks like you're encountering an error when using the Pandas Agent create_pandas_dataframe_agent with LlamaCpp instead of OpenAI in Python 3.9 on a SageMaker notebook. There have been some comments from other users sharing similar issues and providing potential solutions, as well as encountering a ValueError related to the context window of the LlamaCpp model.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!

dosubot[bot] avatar Dec 06 '23 17:12 dosubot[bot]