
[BUG] Streaming with an LLM node requires `stream: true` in the `inputs` of the LLM node in `flow.dag.yaml` for Docker deploys; it annoyingly gets deleted whenever you run the flow in VS Code

journeyman-msft opened this issue 1 year ago • 2 comments

Describe the bug Streaming with an LLM node requires stream: true in the inputs of the LLM node in flow.dag.yaml. Annoyingly, the setting is deleted whenever you run the flow in VS Code. So when you deploy to Docker, make sure deploy/flow/flow.dag.yaml still has it.

Example below

  inputs:
    deployment_name: gpt-4-32k
    temperature: 0
    question: ${QueryRewriter.output}
    concat_chunks: ${FormatArticles.output}
    stream: true
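
For context, a sketch of where this inputs block sits in a full LLM node entry in flow.dag.yaml; the node name, template path, and connection name below are illustrative, not taken from the original report:

```yaml
nodes:
- name: answer_question            # illustrative node name
  type: llm
  source:
    type: code
    path: answer_question.jinja2   # illustrative prompt template
  inputs:
    deployment_name: gpt-4-32k
    temperature: 0
    question: ${QueryRewriter.output}
    concat_chunks: ${FormatArticles.output}
    stream: true                   # the input that keeps getting removed
  connection: open_ai_connection   # illustrative connection name
  api: chat
```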

How To Reproduce the bug Steps to reproduce the behavior and how frequently you can experience it:

  1. Choose a sample
  2. Deploy to Docker
  3. Use your favorite HTTP call tool (Thunder Client is good) with the streaming header "Accept": "text/event-stream, application/json"
  4. Notice the data does not come in as chunks
  5. Now, using the same sample, modify flow.dag.yaml to have stream: true in the LLM node
  6. Deploy to Docker
  7. Use your favorite HTTP call tool (Thunder Client is good) with the streaming header "Accept": "text/event-stream, application/json"
  8. Notice the chunks
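
To check the difference between steps 4 and 8 programmatically rather than by eye, a minimal sketch of splitting a text/event-stream response body into its chunks; the sample payload below is illustrative, not captured from a real promptflow deployment:

```python
def parse_sse_chunks(body: str) -> list[str]:
    """Split a text/event-stream body into the payloads of its `data:` lines.

    SSE events are separated by blank lines; each event here is assumed to
    carry a single `data:` field (the common case for LLM token streams).
    """
    chunks = []
    for event in body.split("\n\n"):
        for line in event.splitlines():
            if line.startswith("data:"):
                chunks.append(line[len("data:"):].strip())
    return chunks

# Illustrative payload: what a streaming deployment emits, chunk by chunk.
sample = (
    "data: The\n\n"
    "data: quick\n\n"
    "data: brown fox\n\n"
)
print(parse_sse_chunks(sample))  # three separate chunks
```

A non-streaming deployment returns a single JSON body with no `data:` lines, so the same parser yields an empty list, which is one way to assert the bug in a test.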

Bonus: if you set stream: true, run the flow with VS Code's promptflow extension, then open flow.dag.yaml again, the setting disappears. Very hard for users to keep this in place.

Expected behavior I expect a UI method in VS Code to set stream: true, or for the setting not to be deleted from flow.dag.yaml. Or perhaps the docs should state that an LLM node streams automatically.

Screenshots (two screenshots are attached to the original issue)

Running Information (please complete the following information):

  • Promptflow Package Version using pf -v: 1.6.0
  • Operating System: Docker serving via promptflow
  • Python Version using python --version: 3.9.16

Additional context

journeyman-msft avatar Mar 28 '24 19:03 journeyman-msft

With the latest version of the promptflow VS Code extension and promptflow==1.7.0, stream: true is no longer deleted when a user runs a flow, but streaming is still not enabled by default for an LLM node as the documentation says.

journeyman-msft avatar Apr 25 '24 18:04 journeyman-msft

Hi @journeyman-msft , did you deploy your flow following this doc? https://microsoft.github.io/promptflow/how-to-guides/deploy-a-flow/deploy-using-docker.html + @elliotzh to help with the deploy issue.

D-W- avatar Apr 28 '24 03:04 D-W-

Hi, we're sending this friendly reminder because we haven't heard back from you in 30 days. We need more information about this issue to help address it. Please be sure to give us your input. If we don't hear back from you within 7 days of this comment, the issue will be automatically closed. Thank you!

github-actions[bot] avatar May 28 '24 21:05 github-actions[bot]