openai-python
Cache-control headers are not set when polling agent status
### Confirm this is an issue with the Python library and not an underlying OpenAI API
- [X] This is an issue with the Python library
### Describe the bug
When using `client.beta.threads.runs.create_and_poll`, I would expect the `Cache-Control` header to be set to `no-cache`, so that an upstream caching proxy does not store the agent response while the library repeatedly polls the same API URL.
### To Reproduce
- Set up a caching proxy, e.g. llm_proxy:

  ```shell
  $ git clone git@github.com:Proxati/llm_proxy.git
  $ cd llm_proxy
  ```

- Start the proxy in cache mode:

  ```shell
  $ go run main.go cache --debug
  ```

- Use the Agent API to make a request:

  ```shell
  $ cd llm_proxy/examples/python
  $ poetry run agent/agent.py
  ```

- The agent polls the run status forever: the proxy caches the first status response, because the request doesn't have a `Cache-Control` header set.
### Code snippets
```python
thread = client.beta.threads.create()
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Can you help me? How does AI work?",
)
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="When using a caching proxy, you will never return a 'completed' status",
)
```
### OS
any
### Python version
any
### Library version
openai-1.42.0
Thanks for reporting! In the meantime, you can explicitly set the `Cache-Control` header yourself:
```python
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="When using a caching proxy, you will never return a 'completed' status",
    extra_headers={"Cache-Control": "no-cache"},
)
```