llm-app
[FEATURE] OpenAI Chat Completion response streaming support
Currently, the LLM App client wrapper for the OpenAI API does not appear to support streaming completions. It would be nice to have this, so that ChatGPT responses can be streamed incrementally into Pathway's output connectors such as Kafka, Redpanda, or Debezium instead of arriving as a single final message.
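As a rough illustration (not the actual wrapper code), with `stream=True` the OpenAI API returns an iterator of chunks whose `choices[0].delta` carries incremental content, following the dict-shaped chunks shown in the cookbook linked below. A sketch of how the wrapper might consume such a stream, using a simulated chunk list in place of a real API call:

```python
# Hypothetical sketch: consuming OpenAI chat-completion chunks as a stream.
# Chunk shapes follow the dict format from the OpenAI cookbook example;
# the simulated list below stands in for a real
# `openai.ChatCompletion.create(..., stream=True)` call.

def iter_stream_content(chunks):
    """Yield the incremental text carried by each streamed chunk."""
    for chunk in chunks:
        delta = chunk.get("choices", [{}])[0].get("delta", {})
        content = delta.get("content")
        if content is not None:
            yield content

# Simulated stream of chunks (assumed data, for illustration only):
simulated = [
    {"choices": [{"delta": {"role": "assistant"}}]},  # first chunk: role only
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},                      # final chunk: no content
]

# Each yielded piece could be forwarded to an output connector as it arrives.
pieces = list(iter_stream_content(simulated))
print("".join(pieces))  # → "Hello, world"
```

The key design point is that each delta is emitted downstream as soon as it arrives, rather than buffering the whole completion.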
References:
https://github.com/openai/openai-cookbook/blob/main/examples/How_to_stream_completions.ipynb
https://platform.openai.com/docs/api-reference/completions/create#completions/create-stream
Can I work on this? Please assign this issue to me.