
Cache-control headers are not set when polling agent status

Open · robbyt opened this issue 1 year ago · 1 comment

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • [X] This is an issue with the Python library

Describe the bug

When using client.beta.threads.runs.create_and_poll, I would expect the Cache-Control header to be set to no-cache, so that an upstream caching proxy does not store the run-status response while the client repeatedly polls the same API URL.
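
One way to confirm what the library actually sends is to route the client through a custom httpx client that logs request headers. This is only a rough sketch (the hook name and wiring are illustrative, using openai-python's http_client option):

import httpx
from openai import OpenAI

def log_request(request: httpx.Request) -> None:
    # Print each outgoing request and whether it carries a Cache-Control header.
    print(request.method, request.url, "Cache-Control:", request.headers.get("cache-control"))

# openai-python accepts a custom httpx.Client; event_hooks lets us observe every request.
client = OpenAI(http_client=httpx.Client(event_hooks={"request": [log_request]}))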

To Reproduce

  1. Set up a caching proxy, e.g., llm_proxy:
$ git clone git@github.com:Proxati/llm_proxy.git
$ cd llm_proxy
  2. Start the proxy in cache mode:
$ go run main.go cache --debug
  3. Use the Agent API to make a request:
$ cd llm_proxy/examples/python
$ poetry run agent/agent.py
  4. The agent polls the run status forever: because the request has no Cache-Control header, the proxy caches the status response and keeps replaying it on every subsequent poll (see the polling sketch after the code snippets below).

Code snippets

# Assumes OPENAI_API_KEY is set in the environment and that `assistant`
# was created earlier (e.g., via client.beta.assistants.create).
from openai import OpenAI

client = OpenAI()

thread = client.beta.threads.create()

message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Can you help me? How does AI work?",
)

# create_and_poll creates the run and then repeatedly polls its status;
# each poll is a GET against the same run URL.
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="When using a caching proxy, you will never return a 'completed' status",
)
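
For context, create_and_poll behaves roughly like the loop below (a simplified sketch, not the library's actual implementation): it keeps requesting the same run URL until the status leaves the in-progress states, so a cached status response means it never terminates.

import time

# Simplified approximation of the polling that create_and_poll performs.
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in {"queued", "in_progress", "cancelling"}:
    time.sleep(1)  # the real client waits between polls
    # Same URL every time, so a caching proxy can keep replaying the first response.
    run = client.beta.threads.runs.retrieve(run_id=run.id, thread_id=thread.id)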


OS

any

Python version

any

Library version

openai-1.42.0

robbyt · Aug 25 '24 03:08

Thanks for reporting! In the meantime, you can explicitly set the Cache-Control header yourself:

run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="When using a caching proxy, you will never return a 'completed' status",
    extra_headers={'Cache-Control': 'no-cache'},
)
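
If you want the header on every request rather than per call, you should also be able to set it once when constructing the client via default_headers, along these lines:

from openai import OpenAI

# Adds the header to every request made by this client instance.
client = OpenAI(default_headers={"Cache-Control": "no-cache"})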

RobertCraigie · Sep 06 '24 10:09