Add async calls to (Azure) OpenAI
The current invoke does not allow for async calls to the OpenAI API.
This is a blocker for adoption where non-blocking use is needed (or inelegant hacks need to be added on top).
The proposal is to add an ainvoke interface that works with the same inputs and resolves to the same outputs.
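For illustration only, here is a rough sketch of what consuming such an async path could look like. The class name, the ainvoke signature, and the wiring are assumptions that simply mirror a synchronous invoke, not an existing prompty API; only the AsyncAzureOpenAI client comes from the openai package itself.

```python
import asyncio
from openai import AsyncAzureOpenAI  # async client shipped with the openai package


# Hypothetical async executor mirroring a synchronous invoke():
# same inputs, same resolved outputs, just awaitable.
class AzureOpenAIExecutorAsync:
    def __init__(self, endpoint: str, api_key: str, api_version: str):
        self.client = AsyncAzureOpenAI(
            azure_endpoint=endpoint,
            api_key=api_key,
            api_version=api_version,
        )

    async def ainvoke(self, model: str, messages: list[dict]) -> str:
        # Non-blocking call: the event loop stays free while waiting on the API.
        response = await self.client.chat.completions.create(
            model=model,
            messages=messages,
        )
        return response.choices[0].message.content


async def main() -> None:
    executor = AzureOpenAIExecutorAsync(
        endpoint="https://<your-resource>.openai.azure.com",
        api_key="<key>",
        api_version="2024-02-01",
    )
    answer = await executor.ainvoke(
        model="gpt-4o",
        messages=[{"role": "user", "content": "hello"}],
    )
    print(answer)


if __name__ == "__main__":
    asyncio.run(main())
```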
Good call - I need to do that.
Hi @sethjuarez, I see that it is already implemented in promptflow.core.
In fact, the whole Prompty implementation is more complete there (escaping, handling of tools in the parser, streaming, token count).
Is there a plan to unify the base parsing?
Tool handling, streaming, and token count already work in prompty (if the API reports tokens, of course). I do agree that I need to add async invokers. Give me a couple of days to get it in (some folks are also working on json_schema enhancements).