[Feature Request] Promptflow can't save node outputs when an exception happens
Is your feature request related to a problem? Please describe.
At the moment, when an exception is raised while running a flow as a function (for example, a content management policy exception), the outputs of all nodes are lost:
```python
from promptflow import load_flow
# WrappedOpenAIError is assumed to come from the promptflow-tools package;
# the exact import path may differ between promptflow versions.
from promptflow.tools.exception import WrappedOpenAIError

f = load_flow(source="../../examples/flows/chat/chat-basic/")
f.context.streaming = True
try:
    result = f(
        chat_history=[
            {
                "inputs": {"chat_input": "Hi"},
                "outputs": {"chat_output": "Hello! How can I assist you today?"},
            }
        ],
        question="How are you?",
    )
except WrappedOpenAIError as exc:
    # When the flow raises, the outputs of the nodes that already succeeded are lost as well.
    pass
else:
    # With streaming enabled, result["answer"] is a generator; iterate it to build the answer.
    answer = ""
    for r in result["answer"]:
        answer += r
```
Describe the solution you'd like
When an exception happens, could the outputs of the nodes that already succeeded be saved somewhere? For example, in a flow node1 -> node2 -> node3, if node2 raises an exception we would still have the output from node1 (see the sketch below).
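For illustration only, a minimal sketch of the kind of API that could work; `partial_node_outputs` is a hypothetical attribute and does not exist in promptflow today:

```python
from promptflow import load_flow

f = load_flow(source="../../examples/flows/chat/chat-basic/")
try:
    result = f(chat_history=[], question="How are you?")
except Exception as exc:
    # Hypothetical: the raised exception (or the flow context) could carry the
    # outputs of the nodes that completed before the failure, e.g. {"node1": ...}.
    partial = getattr(exc, "partial_node_outputs", {})
    for node_name, output in partial.items():
        print(f"{node_name}: {output}")
```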
Describe alternatives you've considered
Not sure

Additional context
Not sure
Hi @vhan2kpmg , we currently model the flow-as-function feature like a function call, which means that when calling the flow function, no intermediate data is persisted. We won't change this behavior in the near future. I've added a long-term tag to track this.
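A possible interim workaround, sketched under the assumption that the local SDK's batch run persists its records under ~/.promptflow/.runs (persistence details may vary by version): submit the flow as a batch run with PFClient instead of calling it as a function, then inspect the run details afterwards.

```python
from promptflow import PFClient

pf = PFClient()

# Submit the flow as a batch run; the run record (inputs, outputs, logs) is
# written to the local run store rather than being kept only in memory.
run = pf.run(
    flow="../../examples/flows/chat/chat-basic/",
    data="./questions.jsonl",  # hypothetical input file, one JSON line per question
    column_mapping={
        "question": "${data.question}",
        "chat_history": "${data.chat_history}",
    },
)

# Per-line inputs and outputs as a pandas DataFrame; failed lines are reported
# alongside the lines that succeeded.
details = pf.get_details(run)
print(details)
```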