pandas-ai
OpenAssistant Error
Hey, thanks for the package! When I tried using OpenAssistant, I got this error. The same code works fine with OpenAI.
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/IPython/core/interactiveshell.py", line 3553, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File ["<ipython-input-9-ac30c70b206d>"](https://localhost:8080/#), line 2, in <cell line: 2>
pandas_ai.run(df, prompt='Which are the 5 happiest countries?')
File "/usr/local/lib/python3.10/dist-packages/pandasai/__init__.py", line 70, in run
code = self._llm.generate_code(
File "/usr/local/lib/python3.10/dist-packages/pandasai/llm/base.py", line 76, in generate_code
return self._extract_code(self.call(instruction, prompt))
File "/usr/local/lib/python3.10/dist-packages/pandasai/llm/base.py", line 62, in _extract_code
code = self._polish_code(code)
File "/usr/local/lib/python3.10/dist-packages/pandasai/llm/base.py", line 45, in _polish_code
self._remove_imports(code)
File "/usr/local/lib/python3.10/dist-packages/pandasai/llm/base.py", line 24, in _remove_imports
tree = ast.parse(code)
File "/usr/lib/python3.10/ast.py", line 50, in parse
return compile(source, filename, mode, flags,
File "<unknown>", line 5
df =
^
SyntaxError: invalid syntax
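For context, the call in the traceback comes from a setup along these lines. This is only a minimal sketch: the `OpenAssistant` import path and the `api_token` parameter are assumed from the pandasai docs of that era and may differ in your version.

```python
import pandas as pd
from pandasai import PandasAI
# Assumed import path for the OpenAssistant LLM in early pandasai releases.
from pandasai.llm.open_assistant import OpenAssistant

df = pd.DataFrame({
    "country": ["Finland", "Denmark", "Iceland", "Israel", "Netherlands", "Sweden"],
    "happiness_index": [7.8, 7.6, 7.5, 7.4, 7.4, 7.3],
})

# api_token is a Hugging Face Hub token; the parameter name is assumed.
llm = OpenAssistant(api_token="YOUR_HF_API_TOKEN")
pandas_ai = PandasAI(llm)

# This is the call that raises the SyntaxError when the model returns
# truncated or non-Python output (in the traceback it stopped at "df =").
pandas_ai.run(df, prompt="Which are the 5 happiest countries?")
```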
@amrrs thanks a lot for reporting. I suspect it has to do with the token limit. Unfortunately, such models allow a very low maximum number of tokens. I'll look into it a bit!
@gventuri Thanks for looking into it!
I have the same error:
Traceback (most recent call last):
File D:\Anaconda3\lib\site-packages\IPython\core\interactiveshell.py:3369 in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
Input In [14] in <cell line: 1>
pandas_ai.run(df, prompt='I want to buy a chair')
File D:\Anaconda3\lib\site-packages\pandasai\__init__.py:70 in run
code = self._llm.generate_code(
File D:\Anaconda3\lib\site-packages\pandasai\llm\base.py:76 in generate_code
return self._extract_code(self.call(instruction, prompt))
File D:\Anaconda3\lib\site-packages\pandasai\llm\base.py:62 in _extract_code
code = self._polish_code(code)
File D:\Anaconda3\lib\site-packages\pandasai\llm\base.py:45 in _polish_code
self._remove_imports(code)
File D:\Anaconda3\lib\site-packages\pandasai\llm\base.py:24 in _remove_imports
tree = ast.parse(code)
File D:\Anaconda3\lib\ast.py:50 in parse
return compile(source, filename, mode, flags,
File <unknown>:1
One possible solution is:
^
SyntaxError: invalid syntax
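In both tracebacks the underlying issue is the same: the model replies with prose or truncated code instead of pure Python, and pandasai's `_remove_imports` then fails when it feeds that reply to `ast.parse`. A standalone illustration of that failure mode:

```python
import ast

# Replies like the ones in the tracebacks above: natural-language text
# ("One possible solution is:") or code that was cut off ("df =").
for reply in ["One possible solution is:", "df ="]:
    try:
        ast.parse(reply)  # what _remove_imports() does with the generated "code"
    except SyntaxError as exc:
        print(f"SyntaxError on {reply!r}: {exc}")
```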
Try using this custom LangChain LLM wrapper: https://github.com/IntelligenzaArtificiale/Free-AUTO-GPT-with-NO-API/blob/main/FreeLLM/HuggingChatAPI.py
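For reference, the linked HuggingChatAPI.py follows LangChain's standard custom-LLM pattern, roughly like the sketch below. The class name is hypothetical and the backend call is a placeholder, and pandasai at the time expected one of its own LLM classes, so you may still need a small adapter on top of a LangChain LLM.

```python
from typing import List, Optional

from langchain.llms.base import LLM  # LangChain's base class for custom LLMs


class HuggingChatLLM(LLM):
    """Minimal custom LangChain LLM skeleton (hypothetical name)."""

    @property
    def _llm_type(self) -> str:
        return "huggingchat"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Replace this with a real call to the chat backend; returning the
        # prompt is just a placeholder so the sketch runs.
        return f"echo: {prompt}"
```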
OK, there does seem to be a limit on the number of tokens. We are working on possible workarounds to overcome these issues and to integrate more LLMs apart from OpenAI! I'll keep you posted!
very good
Update: I've been working on this issue and fixed some of the problems related to making OpenAssistant work. However, it seems the model is not "smart enough" to understand the complex prompt we pass and follow the instructions. I'll investigate further to figure out whether improving the prompt produces a significantly better (and correct) output; otherwise we'll have to remove it from the supported LLMs.
In the meantime, I've added the Starcoder LLM; you can try it out as an alternative if you want!
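Usage should look roughly like this (a sketch; the Starcoder import path and `api_token` parameter are assumed from the pandasai docs of that time):

```python
from pandasai import PandasAI
# Assumed import path for the Starcoder LLM in pandasai at the time.
from pandasai.llm.starcoder import Starcoder

llm = Starcoder(api_token="YOUR_HF_API_TOKEN")  # Hugging Face Hub token
pandas_ai = PandasAI(llm)
pandas_ai.run(df, prompt="Which are the 5 happiest countries?")  # df as in the earlier snippet
```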
Closing, as there are now open-source LLMs that perform better than OpenAssistant, and OpenAssistant doesn't perform well enough.
@gventuri Thanks for the update. Is there anything you'd recommend? I guess we have to use the LangChain connector for this.
@amrrs I would recommend using the OpenAI API, or, if you want a free one, either Falcon or Starcoder!
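Falcon follows the same pattern as the Starcoder snippet above (import path again assumed from the pandasai docs of that era):

```python
from pandasai.llm.falcon import Falcon  # assumed import path

llm = Falcon(api_token="YOUR_HF_API_TOKEN")
```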
Thank you @gventuri