Honglin
Hi @bastbu, thank you for your feedback. Since we are currently on the Lunar New Year holiday, it may take some time to formally investigate and fix the issue. We...
Hi @thomassantosh, I used a Mac to run the tutorial but could not repro the error. My environment is: OS: Ventura 13.1, azure-ai-ml=1.2.0, python=3.8.13. Which SDK version are you using?...
I can repro this error running the notebook you provided, and I even repro-ed it on Windows as well. Not sure about the root cause yet, but I'll take a further...
@luigiw The referenced folder is created inside the notebook; the customer runs this notebook in their own repo, and the .gitignore file has excluded those snapshot files. @thomassantosh Looks like the .gitignore...
Hi @duongthaiha, thanks for the feedback. Our team does not have the bandwidth right now, so I'll mark this as a long-term item and pick it up once we have...
> However, if there is a node after llm node, stream result will be just final output, e.g. result["answer"] will be string. I think that's due to the Python nature...
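A minimal sketch of why this can happen, assuming the downstream node simply consumes the stream (the node functions here are hypothetical, not prompt flow internals): once a node after the LLM node iterates over the streamed chunks, the generator is exhausted and what reaches the flow output is a plain string.

```python
def llm_node():
    # Simulates an LLM node that streams its answer chunk by chunk
    # by returning a generator.
    for chunk in ["Hello", ", ", "world"]:
        yield chunk


def downstream_node(text):
    # A node placed after the LLM node consumes the whole stream here,
    # so the final flow output is an ordinary string, not a generator.
    return "".join(text)


result = {"answer": downstream_node(llm_node())}
assert isinstance(result["answer"], str)
print(result["answer"])  # Hello, world
```

If no downstream node consumed the generator, `result["answer"]` would still be the generator itself and could be iterated chunk by chunk.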
Hi @JacquesGariepy, from the trace stack it looks like a pretty old version of prompt flow. Could you please try to upgrade to the latest version 1.12.0...
Hi @JacquesGariepy, I have some questions: 1. What are the repro steps for this error? Are you trying to create a new empty flow with the VS Code extension? I did the same...
Hi @KatoStevenMubiru, thank you for your contribution request. I have a few questions: 1. Could you please provide more details about the "Unify" you mentioned? Do you have any documentation...
@KatoStevenMubiru Thanks for the clarification. Could you please provide a pseudo-code example to demonstrate how users would utilize Promptflow along with the integrated Unify features? This would help us better...