pandas-ai
Huggingface Inference Endpoints
Hi, I wanted to ask whether it is possible to use Huggingface Inference Endpoints and, if so, where to set the token. Can you give me details?
Thanks
Hi @emanueleparini, sure, you can find some details in the documentation: https://docs.pandas-ai.com/en/latest/LLMs/llms/#huggingface-via-text-generation
Feel free to ask if you have any further questions!
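For reference, the pattern from that docs page looks roughly like this (a minimal sketch; the file name and server URL are placeholders):

```python
from pandasai import SmartDataframe
from pandasai.llm import HuggingFaceTextGen

# Point the LLM wrapper at a text-generation inference server
llm = HuggingFaceTextGen(
    inference_server_url="http://127.0.0.1:8080"  # replace with your server URL
)

# Wrap a dataframe (or CSV) and ask questions in natural language
df = SmartDataframe("data.csv", config={"llm": llm})
df.chat("Which country has the highest gdp?")
```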
Hi @gventuri, thanks for your reply. I had seen the documentation, but it describes a local server. My endpoint is hosted on AWS with an API token, and I couldn't figure out where to set it; there doesn't seem to be a variable to define. Thanks
@emanueleparini you can just change the URL to your AWS endpoint instead of using a local server!
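On the token question: the underlying `text_generation` client accepts custom request headers, so one way to sanity-check the remote endpoint and token outside of pandas-ai is to call it directly. This is a sketch only; the URL and token are placeholders, and whether `HuggingFaceTextGen` itself forwards such headers depends on your pandas-ai version.

```python
from text_generation import Client

# Placeholder URL and token: replace with your AWS-hosted endpoint and API token
client = Client(
    base_url="https://your-endpoint.example.com",
    headers={"Authorization": "Bearer hf_xxx"},
)

# If this works, the endpoint and token are fine and the issue is on the pandas-ai side
print(client.generate("Hello", max_new_tokens=20).generated_text)
```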
Yes, I did, but I don't get any response; it's as if the text isn't being picked up.
Unfortunately, I was not able to answer your question, because of the following error:
No code found in the response
@emanueleparini we simplified the prompt so that it is easier to understand for the non-SOTA models with version 1.5.4. Can you give it a try? Which LLM are you using?
@gventuri it works now. We have tried it with Mistral / Zephyr and Falcon. Thanks!
@emanueleparini did you use any free endpoints, and could you share some code? I can't get it to work.
I'm not sure I understood your point correctly, @gventuri, about using the AWS URL instead. I got an error:
Unfortunately, I was not able to answer your question, because of the following error:
'error'
when I used the URL from the Amazon SageMaker endpoint summary. I created a Mistral endpoint on Amazon SageMaker and got this error when running the code. I hope you can elaborate. Thank you!
@kenrubio can you share the full logs? That would help for me to understand it better :D
Hi @gventuri,
Here is the verbose return:
2024-02-15 22:17:40 [INFO] Question: Which country has the highest gdp?
2024-02-15 22:17:41 [INFO] Running PandasAI with huggingface-text-generation LLM...
2024-02-15 22:17:41 [INFO] Prompt ID: 04b39979-de77-4cb9-846c-db6c251461ee
2024-02-15 22:17:41 [INFO] Executing Step 0: CacheLookup
2024-02-15 22:17:41 [INFO] Executing Step 1: PromptGeneration
2024-02-15 22:17:41 [INFO] Using prompt: <dataframe>
dfs[0]:10x3
country,gdp,happiness_index
France,4138730290,6.66
United States,9074050267,6.94
United Kingdom,5028720044,7.16
</dataframe>
Update this initial code:
python
# TODO: import the required dependencies
import pandas as pd
# Write code here
# Declare result var: type (possible values "string", "number", "dataframe", "plot").
Examples: { "type": "string", "value": f"The highest salary is {highest_salary}." } or { "type": "number", "value": 125 } or { "type": "dataframe", "value": pd.DataFrame({...}) } or { "type": "plot", "value": "temp_chart.png" }
Q: Which country has the highest gdp?
Variable `dfs: list[pd.DataFrame]` is already declared.
At the end, declare "result" variable as a dictionary of type and value.
Generate python code and return full updated code:
2024-02-15 22:17:41 [INFO] Executing Step 2: CodeGenerator
2024-02-15 22:17:42 [ERROR] Pipeline failed on step 2: 'error'
Unfortunately, I was not able to answer your question, because of the following error:
'error'
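For context, pandas-ai expects the LLM to return a runnable Python snippet that declares a `result` dictionary, which it then extracts and executes. For the prompt above, a valid completion would look roughly like this (a sketch of the expected shape, not actual model output):

```python
import pandas as pd

# The dataframe is provided by pandas-ai in the pre-declared dfs list
df = dfs[0]

# Find the country with the highest GDP
highest_country = df.loc[df["gdp"].idxmax(), "country"]

# Declare the result variable in the format the prompt asks for
result = {
    "type": "string",
    "value": f"The country with the highest GDP is {highest_country}.",
}
```

The `'error'` above is raised during code generation (step 2), i.e. before any such code runs, which points to the endpoint's response not being readable rather than to the generated code itself.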
I was actually able to work around this by creating an inference URL that uses the Amazon SageMaker endpoint. I have a new and related issue here: https://github.com/Sinaptik-AI/pandas-ai/issues/936
The model keeps on giving an error message:
Unfortunately, I was not able to answer your question, because of the following error: Expecting value: line 1 column 1 (char 0)
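That last error is a JSON decode failure, i.e. the response body was not the JSON the client expected. One thing worth noting is that a SageMaker endpoint is not a plain HTTP text-generation server; it has to be invoked with AWS-signed requests, for example via boto3. Below is a minimal sketch of calling a deployed Mistral/TGI endpoint directly; the endpoint name is hypothetical and the payload/response shape depends on the serving container.

```python
import json
import boto3

# Hypothetical endpoint name: replace with your SageMaker endpoint
ENDPOINT_NAME = "mistral-7b-endpoint"

runtime = boto3.client("sagemaker-runtime")

response = runtime.invoke_endpoint(
    EndpointName=ENDPOINT_NAME,
    ContentType="application/json",
    Body=json.dumps({
        "inputs": "Which country has the highest gdp?",
        "parameters": {"max_new_tokens": 200},
    }),
)

# TGI-style containers typically return a JSON list with "generated_text"
print(json.loads(response["Body"].read())[0]["generated_text"])
```

If the direct call succeeds but the pandas-ai integration still fails, the mismatch is likely in how the intermediate inference URL wraps or returns the SageMaker response.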