
Graph/Plot Continue Execution Feature


Right now, with a prompt asking to show a graph, pandasAI returns code ending in plt.show(), which displays the graph and blocks code execution until the graph window is closed.

Funnily enough, I was able to force the LLM to return plt.show(block=False) with the following prompt: Make a bar chart of the order qty for all unique part id's. After showing the chart, don't block the process. Continue with plt.show(block=False).

(and it worked)

I'm not sure whether this will be a pandasAI code change or some kind of prompt concatenation trick, but allowing code execution to continue would be a very useful feature to have at hand. Great project, for the record 👍🏼
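For reference, this is the difference in plain matplotlib (a minimal standalone sketch, independent of pandasAI):

import matplotlib.pyplot as plt

plt.bar(["A", "B"], [3, 5])

# plt.show() would block here until the window is closed;
# block=False returns immediately and leaves the window open
plt.show(block=False)
print("Execution continues while the chart is still on screen")
plt.pause(2)       # keep the window responsive for a moment
plt.close("all")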

JeffreyLind3 avatar May 05 '23 14:05 JeffreyLind3

Here is a quick workaround using the prompt concatenation trick:

print("\nSelect Your Prompt Type:")
print("1. Text")
print("2. Graph")

promptType = int(input("\nEnter your choice (1 or 2): "))

dataPrompt = input("\nEnter your prompt: ")

if promptType == 2:
    # Append an instruction that steers the LLM to a non-blocking call
    graphContinueExecutionText = " Use plt.show(block=False)"
    dataPrompt += graphContinueExecutionText

JeffreyLind3 avatar May 05 '23 14:05 JeffreyLind3

Hey @JeffreyLind22, thanks a lot for sharing. We'll look into it soon! Can you provide the code that makes PandasAI fail, including the prompt?

gventuri avatar May 05 '23 17:05 gventuri

Yes, here is my CSVAIPrompt function, based on the from_csv.py example. The two Test Graphing Prompts are at the bottom of the code block.

Any graphing prompt with "Use plt.show(block=False)" appended forces the response to use plt.show(block=False). This then allows program execution to continue while still showing the graph (CSVAIPrompt is called inside a while loop elsewhere, and should return to it after outputting the graph).

Put simply: responseContains(plt.show(block=False)) if and only if promptContains(plt.show(block=False))

Any prompt that does not specify plt.show(block=False) does not force the response to use it, thus halting program execution until the graph window is closed.

Again, the inverse: responseDoesNotContain(plt.show(block=False)) if and only if promptDoesNotContain(plt.show(block=False))

I am not sure what the intended behavior of opening a graph mid-execution should be, but having an option would be useful.

import os
import shutil

import pandas
from pandasai import PandasAI
from pandasai.llm.openai import OpenAI


def CSVAIPrompt(usingAI):

    if usingAI:

        # Copy the chosen data file to a fixed name so the rest of the
        # function can always read "filename.csv"
        while True:

            dataSource = input("\nEnter file name and extension (File must be inside of the PlastiGPT folder): ")

            if os.path.isfile(dataSource):

                if os.path.isfile("filename.csv"):
                    os.remove("filename.csv")

                shutil.copyfile(dataSource, "filename.csv")
                break

            else:
                print("\nThe file name you entered doesn't exist. Be sure to include the file extension")

        print("\nSelect Your Prompt Type:")
        print("1. Text")
        print("2. Graph")

        while True:

            try:
                promptType = int(input("\nEnter your choice (1 or 2): "))

                if promptType not in [1, 2]:
                    print("\nInvalid choice. Please choose (1 or 2)")
                else:
                    break

            except ValueError:
                print("\nInvalid input. Please enter a number")

        dataPrompt = input("\nEnter your prompt: ")

        if promptType == 2:
            # Steer the LLM towards a non-blocking plt.show() call
            graphContinueExecutionText = " Use plt.show(block=False)"
            dataPrompt += graphContinueExecutionText

        print("\n")

        # ChatGPT prompt starts here
        llm = OpenAI(api_token=os.environ["OPENAI_API_KEY"])
        pandasAI = PandasAI(llm, verbose=True)

        pandasDataFrame = pandas.read_csv("filename.csv")
        response = pandasAI.run(pandasDataFrame, prompt=dataPrompt)

    else:
        print("\nAI code is disabled for testing purposes")

    # Test Text Prompt: What is the average order qty for part id 1020-19?

    # Test Graphing Prompt: Make a bar chart of the order qty for all unique part id's.
    # Test Graphing Prompt 2: Make a bar chart of the order qty for all desired ship dates.

JeffreyLind3 avatar May 05 '23 19:05 JeffreyLind3

We can easily replace any calls to plt.show() with plt.show(block=False). Would that give us the desired behaviour?
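For example, a rough sketch of such a rewrite step (the function name and where it would hook into the code-execution pipeline are hypothetical):

import re

def force_non_blocking(generated_code: str) -> str:
    # Rewrite bare plt.show() calls in the LLM-generated code so that
    # showing the chart no longer blocks execution
    return re.sub(r"plt\.show\(\s*\)", "plt.show(block=False)", generated_code)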

jonbiemond avatar Jun 01 '23 16:06 jonbiemond

I think adding plt.close('all') should do the trick, right? @jonbiemond

If so, I guess I can close it!

gventuri avatar Jun 20 '23 14:06 gventuri

I think adding plt.close('all') should do the trick, right? @jonbiemond

I fear not: I believe the code stops at plt.show(), so it doesn't reach plt.close('all') until the popup is closed by the user. I haven't seen consistent behaviour here; my theory is that some IDEs, like Jupyter and PyCharm, actually intercept the popups.
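This is easy to verify with a plain script run outside an IDE (a minimal sketch):

import time
import matplotlib.pyplot as plt

plt.plot([1, 2, 3])
start = time.time()
plt.show()                                          # blocks until the window is closed
print(f"Reached after {time.time() - start:.1f}s")  # only runs afterwards
plt.close("all")                                    # the window is already gone by now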

jonbiemond avatar Jun 20 '23 14:06 jonbiemond

I am facing issues in both Google Colab and PyCharm.

amjadraza avatar Jun 26 '23 13:06 amjadraza

I have a solution here: I'm using plt.ion() before pandas_ai.run(...) to enable interactive mode. With this, execution won't wait at plt.show(), and plt.close('all') will execute as well. Hope this helps.
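Roughly (a sketch using the same PandasAI API as the code above; the API key, CSV file, and prompt are placeholders):

import pandas as pd
import matplotlib.pyplot as plt
from pandasai import PandasAI
from pandasai.llm.openai import OpenAI

plt.ion()  # interactive mode: plt.show() no longer blocks, so a
           # trailing plt.close('all') is reached immediately

llm = OpenAI(api_token="YOUR_API_KEY")   # placeholder key
pandas_ai = PandasAI(llm)
df = pd.read_csv("data.csv")             # placeholder file
pandas_ai.run(df, prompt="Make a bar chart of the order qty")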

vpurandara avatar Jun 27 '23 06:06 vpurandara

@vpurandara thanks a lot for sharing. Both plt.ion() and @jonbiemond's block=False solution did the trick!

gventuri avatar Jun 27 '23 23:06 gventuri

thanks guys

amjadraza avatar Jun 28 '23 00:06 amjadraza

@gventuri Why not make this configurable? I have a case where it's running in a console, but I don't want it to block. And prompting the LLM is not consistent.

Falven avatar Feb 10 '24 07:02 Falven