
[FEATURE] OpenAI Chat Completion response streaming support

Open Boburmirzo opened this issue 1 year ago • 1 comments

I assume that the LLM App API client wrapper for the OpenAI API does not currently support the streaming completions feature.

It would be nice to have it, so that final ChatGPT responses can be streamed into Pathway's output connectors such as Kafka, Redpanda, or Debezium.

References:

https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
https://platform.openai.com/docs/api-reference/completions/create#completions/create-stream
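For reference, here is a minimal sketch of what streaming looks like against the OpenAI API (using the pre-1.0 `openai` Python package, as in the cookbook example above). How the resulting chunks would be wired into the llm-app wrapper and forwarded to Pathway's output connectors is exactly the open part of this request, so that piece is not shown.

```python
import openai

openai.api_key = "sk-..."  # placeholder

# With stream=True the API returns an iterator of partial "delta" chunks
# instead of a single completed message.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Explain streaming in one sentence."}],
    stream=True,
)

for chunk in response:
    # Each chunk carries an incremental delta; the final chunk may be empty.
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="", flush=True)
```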

Boburmirzo avatar Aug 08 '23 09:08 Boburmirzo

Can I work on this? Please assign this issue to me.

mihir1739 avatar Oct 21 '23 13:10 mihir1739