openai-python
Provide a callback whenever retry is triggered
Confirm this is a feature request for the Python library and not the underlying OpenAI API.
- [X] This is a feature request for the Python library
Describe the feature or improvement you're requesting
Is it possible to provide a callback whenever a retry is triggered internally, so that we can know when and how the requests failed?
Additional context
We want to give our users some insight when OpenAI requests fail with a rate limit error.
Thanks for the suggestion – I don't think we have a great way to do this right now, but it seems like a reasonable idea. I don't expect to get to it soon, but will keep the issue open.
Can we just pass in a callback like this?

def error_callback(exception: Exception):
    # custom logic
    ...

completion = await openai_client.chat.completions.create(
    model=model, ..., error_callback=error_callback
)

And here do:

except Exception as err:
    error_callback(err)
I doubt that's how we'd design it. You could use httpx event_hooks: https://www.python-httpx.org/advanced/event-hooks/
Ah, I didn't know we can use a custom httpx Client. This works for us, thanks!
Great! It'd be appreciated if you could share the (rough) code snippet you end up using, so others can benefit.
Sure, this block works for me:

from openai import OpenAI, DEFAULT_TIMEOUT
from openai._base_client import SyncHttpxClientWrapper

def _print_response_status_code(response):
    print("Response status code", response.status_code)

http_client = SyncHttpxClientWrapper(
    base_url="https://api.openai.com/v1",
    timeout=DEFAULT_TIMEOUT,
    follow_redirects=True,
    event_hooks={"response": [_print_response_status_code]},
)
client = OpenAI(http_client=http_client)
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
        {"role": "user", "content": "Where was it played?"},
    ],
)
It's still a bit of a hack to import SyncHttpxClientWrapper and manually set all the defaults (like base_url, timeout, follow_redirects, etc.) for its constructor, though. If it's possible to expose SyncHttpxClientWrapper with all default arguments, it would make life much easier.
@rattrayalex: +1 on this request. Would you be open to a contribution to pass along event_hooks from the OpenAI client initialization to the constructor for the client? We'd really like to avoid having to redefine the http_client and leverage what OpenAI provides out of the box.
> It's still a bit of a hack to import SyncHttpxClientWrapper and manually set all the defaults (like base_url, timeout, follow_redirects, etc.) for its constructor, though. If it's possible to expose SyncHttpxClientWrapper with all default arguments, it would make life much easier.
That's a good point – it'd be nice for us to expose the default args at least, if not a class which has those defaults. I don't think you should really need to import SyncHttpxClientWrapper instead of just httpx.Client, though.
> Would you be open to a contribution to pass along event_hooks from the OpenAI client initialization to the constructor for the client?
I don't think we'd want it to be event_hooks but something closer to on_request=. We have to settle on the design there internally first, so I don't think a contribution at this time would be a wise investment. Thank you though!
cc @RobertCraigie on the above – we should get this slated.
Yes, you definitely should not have to use SyncHttpxClientWrapper – that's an internal-only thing that we use for reference counting.
It might be a good idea to provide something like this though, which would use our default options – thoughts?

import openai

client = openai.OpenAI(
    http_client=openai.DefaultHttpxClient(...),
)
@rattrayalex @RobertCraigie DefaultHttpxClient looks good to me! Can you help expose that for us? Really appreciate it!
I really like the idea of having a DefaultHttpxClient that users can use without needing to redo setting the default properties, along with the other configurable options that are currently available on httpx's client.
I'm having a bit of a hard time connecting the dots to go from on_request handlers to something akin to event_hooks, especially because what I'm trying to get at is understanding not just when a request occurred, but when a 4xx response / retry occurred. So, that necessitates skipping maybe the first request and then catching the other ones.
I'm probably just missing some connection which is likely obvious to the maintainers here, so that's totally fine, just saying out loud that it'd be nice to get some docs on the usage here when this gets worked on.
Appreciate the quick feedback, iterations on design and openness to implementation here a bunch, OpenAI team!
You're correct it isn't trivial, which is why we will need to take our time designing it – and on_request and similar niceties may not land right away. However, DefaultHttpxClient is more straightforward and should be able to happen sooner.
We plan to do both!
Great to hear that there's potential for both of those approaches to be supported natively. Thanks for adding this to the roadmap and proactively engaging here!
@rattrayalex @RobertCraigie Thanks! Is there any ETA for the straightforward DefaultHttpxClient? This is blocking us on an important project, so I'd really appreciate it if it can be fixed asap.
OOC how is this blocking you? That class would just be changing the defaults, and you can change the defaults yourself as well; e.g. this will be functionally exactly the same as the solution we provide:

import openai
import httpx

client = openai.OpenAI(
    http_client=httpx.Client(
        timeout=openai.DEFAULT_TIMEOUT,
        follow_redirects=True,
        limits=httpx.Limits(max_connections=100, max_keepalive_connections=20),  # also currently the httpx default
    )
)
@RobertCraigie Yeah this is not a blocker. We can move forward with this, but please keep us posted if the formal solution is out. Thanks!