GPTCache
[Feature]: Streaming support
Is your feature request related to a problem? Please describe.
No response
Describe the solution you'd like.
No response
Describe an alternate solution.
No response
Anything else? (Additional Context)
No response
GPTCache already supports streaming requests for OpenAI chat completions. Can you describe the problems you encountered in more detail?
@SimFG is there a demo for streaming support?
Usage is consistent with the OpenAI API, for example:
```python
from gptcache import cache
from gptcache.adapter import openai

# Initialize the cache and hand the OpenAI API key through to the adapter.
cache.init()
cache.set_openai_key()

question = "calculate 1+1"
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": question},
    ],
    stream=True,
)
```
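With stream=True the call returns chunks rather than a single completion, so the response is consumed iteratively, the same way as with the OpenAI SDK. A minimal sketch (assuming the adapter yields OpenAI-style chunks with a "delta" object that may carry "content"; this consumption loop is not from the thread above):

```python
# Assemble the answer from the streamed chunks.
answer = ""
for chunk in response:
    delta = chunk["choices"][0]["delta"]
    answer += delta.get("content", "")
print(answer)
```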
@gbertb I hope the above answers your question. If you have any further questions, please don't hesitate to ask. I'll close this issue for now.