async-openai
Rust library for OpenAI
Hey, we (@basetenlabs / @nvidia) are using your crate to run validations and return output. Two things are missing from the Rust crate compared to, e.g., the Python types. - https://github.com/openai/openai-python/blob/4ada66f8f86473f342aa032ed021b62180422dc1/src/openai/types/shared/response_format_text_grammar.py#L10...
- I used ByteString, which makes sure the JSON is displayed as a string when tracing logs are enabled. - Deserialization errors are loaded with data, so upstream can handle them when serving...
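A minimal sketch of the second point, attaching the raw payload to a deserialization error so upstream code can inspect or forward it; `RichDeserializeError` and `deserialize_with_context` are hypothetical names, not part of the crate:

```rust
use serde::de::DeserializeOwned;

// Hypothetical error type: keeps the raw body alongside the serde error so
// callers can log or return the offending payload.
#[derive(Debug)]
pub struct RichDeserializeError {
    pub source: serde_json::Error,
    pub raw_body: String,
}

// Deserialize, but on failure hand back the raw payload as well.
pub fn deserialize_with_context<T: DeserializeOwned>(
    body: &[u8],
) -> Result<T, RichDeserializeError> {
    serde_json::from_slice(body).map_err(|source| RichDeserializeError {
        source,
        raw_body: String::from_utf8_lossy(body).into_owned(),
    })
}
```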
I think this: https://github.com/64bit/async-openai/blob/b26346d3efaeb9861259cd0f2e9c8b8f87793148/examples/tool-call-stream/src/main.rs#L82-L86 and this: https://github.com/64bit/async-openai/blob/b26346d3efaeb9861259cd0f2e9c8b8f87793148/examples/tool-call-stream/src/main.rs#L90-L96 store the arguments to a tool call twice. The later call to parse https://github.com/64bit/async-openai/blob/b26346d3efaeb9861259cd0f2e9c8b8f87793148/examples/tool-call-stream/src/main.rs#L225 fails: ``` thread 'tokio-runtime-worker' panicked at src/bin/tool-call-stream.rs:243:57: called `Result::unwrap()`...
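A sketch of accumulating each streamed tool-call delta exactly once, keyed by index, so the concatenated `arguments` string stays valid JSON and parses without panicking; the delta struct below is a stand-in for the crate's stream types, not the example's actual code:

```rust
use std::collections::BTreeMap;

// Minimal stand-in for a streamed tool-call delta (field names are
// illustrative, not the crate's exact types): each chunk carries an index
// plus optional name/arguments fragments.
struct ToolCallDelta {
    index: u32,
    name: Option<String>,
    arguments: Option<String>,
}

#[derive(Default)]
struct ToolCallAcc {
    name: String,
    arguments: String,
}

// Append each fragment exactly once into the accumulator for its index.
fn accumulate(acc: &mut BTreeMap<u32, ToolCallAcc>, delta: ToolCallDelta) {
    let entry = acc.entry(delta.index).or_default();
    if let Some(name) = delta.name {
        entry.name.push_str(&name);
    }
    if let Some(args) = delta.arguments {
        entry.arguments.push_str(&args);
    }
}
```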
Hi, I was wondering if I could use this wonderful project with audio_url support like below: https://docs.vllm.ai/en/v0.6.0/getting_started/examples/openai_audio_api_client.html Thanks!
Hello, trying out gpt-5-nano with .max_completion_tokens: if I surpass the limit, the content is empty: `choices[0].message.content.is_empty() == true`. This wasn't happening with .max_tokens; the output was cut at the...
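A hedged sketch of checking `finish_reason` before reading `content`; type and field names follow my reading of async-openai's chat API and assume a crate version that exposes `max_completion_tokens`:

```rust
use async_openai::{
    types::{
        ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs, FinishReason,
    },
    Client,
};

// Sketch: detect truncation instead of assuming `content` is populated.
async fn ask(prompt: &str) -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();
    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-5-nano")
        .max_completion_tokens(16u32)
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content(prompt)
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;
    let choice = &response.choices[0];
    if matches!(choice.finish_reason, Some(FinishReason::Length)) {
        // Reply hit the token limit; with reasoning models the visible
        // content can be empty even though tokens were spent.
        eprintln!("truncated; content = {:?}", choice.message.content);
    } else {
        println!("{}", choice.message.content.as_deref().unwrap_or(""));
    }
    Ok(())
}
```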
## Summary This PR introduces an `HttpClient` trait abstraction that allows users to provide custom HTTP client implementations, enabling middleware support for automatic instrumentation, logging, retry logic, and more. ##...
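For illustration only, one shape such an abstraction could take (names and signatures here are guesses, not the PR's actual API), with a logging wrapper as an example of middleware:

```rust
use async_trait::async_trait;

// Illustrative only: a trait that lets callers plug in their own HTTP client.
#[async_trait]
pub trait HttpClient: Send + Sync {
    async fn execute(
        &self,
        request: reqwest::Request,
    ) -> Result<reqwest::Response, reqwest::Error>;
}

// A logging wrapper around any inner client, as one example of middleware.
pub struct LoggingClient<C: HttpClient> {
    inner: C,
}

#[async_trait]
impl<C: HttpClient> HttpClient for LoggingClient<C> {
    async fn execute(
        &self,
        request: reqwest::Request,
    ) -> Result<reqwest::Response, reqwest::Error> {
        tracing::info!(method = %request.method(), url = %request.url(), "outgoing request");
        self.inner.execute(request).await
    }
}
```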
I have questions regarding the tests: OPENAI_API_KEY is set to `"test"`, which causes most of the tests to fail. Is this intended? I wish we had an option to test...
Went down the rabbit hole of streaming backoffs. It turns out eventsource already has something built in; I'm not 100% certain how well it works, but since streaming is a bit...
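As a point of comparison, a generic exponential-backoff wrapper around (re)opening a stream, independent of whatever eventsource provides; this is a sketch of the idea, not the crate's retry mechanism:

```rust
use std::time::Duration;

// Retry an async "open" operation with exponential backoff, capped at 10s.
async fn with_backoff<T, E, F, Fut>(mut open: F, max_attempts: u32) -> Result<T, E>
where
    F: FnMut() -> Fut,
    Fut: std::future::Future<Output = Result<T, E>>,
{
    let mut delay = Duration::from_millis(250);
    let mut attempt = 0;
    loop {
        match open().await {
            Ok(value) => return Ok(value),
            Err(err) if attempt + 1 >= max_attempts => return Err(err),
            Err(_) => {
                tokio::time::sleep(delay).await;
                delay = (delay * 2).min(Duration::from_secs(10));
                attempt += 1;
            }
        }
    }
}
```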
Hi, let's say I want to create a new request where `my_prev_response_id` is an `Option`, so I don't have to handle the input myself. ```rust let request = CreateResponseArgs::default().previous_response_id(my_prev_response_id).build()?; ``` From...
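One way to handle this with the derive_builder-style Args builders the crate uses is to hold the builder mutably and only call the setter when the `Option` is `Some`; this fragment reuses the names from the snippet above and elides the other required fields:

```rust
// Sketch: the setters take &mut self, so apply `previous_response_id` only
// when the id is present. Imports and required fields (model, input, ...)
// are elided here.
let mut builder = CreateResponseArgs::default();
if let Some(id) = my_prev_response_id {
    builder.previous_response_id(id);
}
let request = builder.build()?;
```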
Some OpenAI-compatible providers return completions without the `object` field, which causes a deserialization error. So add a default value for CreateChatCompletionResponse, CreateCompletionResponse, and GeminiCreateChatCompletionResponse.
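A cut-down sketch of the serde mechanism this relies on: with `#[serde(default)]` a missing `object` key deserializes to an empty string instead of erroring (the struct below is a stand-in, not the crate's full CreateChatCompletionResponse):

```rust
use serde::Deserialize;

#[derive(Deserialize)]
struct ChatCompletionResponseSketch {
    id: String,
    // Falls back to "" when the provider omits the field.
    #[serde(default)]
    object: String,
}

fn main() {
    // No "object" key, as returned by some OpenAI-compatible providers.
    let body = r#"{"id": "chatcmpl-123"}"#;
    let parsed: ChatCompletionResponseSketch = serde_json::from_str(body).unwrap();
    assert_eq!(parsed.object, "");
}
```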