async-openai
Rust library for OpenAI
Error: failed to deserialize api response: expected value at line 1 column 1 The error comes up when I try to download an image from the code interpreter with retrieve_content. I checked...
I wish to retain control over the `CreateChatCompletionRequest` for logging and caching purposes after the call is made. Since the `create` function doesn't actually need to own any of the...
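With the current signature, which takes the request by value, one workaround is to clone the request before handing it to the call. A minimal stdlib-only sketch of the pattern, with a hypothetical simplified `ChatRequest` type and a stand-in `create` function playing the role of the consuming API call:

```rust
// Stand-in for an API method that consumes its request by value,
// as `create` does here. Types and names are illustrative only.
#[derive(Clone, Debug, PartialEq)]
struct ChatRequest {
    model: String,
    prompt: String,
}

fn create(request: ChatRequest) -> String {
    // Pretend to call the API; `request` is moved in and dropped here.
    format!("response for model {}", request.model)
}

fn main() {
    let request = ChatRequest {
        model: "gpt-4o".to_string(),
        prompt: "hello".to_string(),
    };

    // Clone before the call so a copy survives for logging/caching.
    let retained = request.clone();
    let response = create(request);

    println!("{response}");
    println!("logged request: {retained:?}");
}
```

Taking the request as `&CreateChatCompletionRequest` in the library itself, as the issue suggests, would avoid the clone entirely; the sketch above is only the caller-side workaround.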
Given that the Chat Completion API is officially compatible, this library should work without any changes. Hence it would be great to have a fully self-contained example which works...
**Background** This project started around Nov 2022 because I did not find any production ready Rust libraries for OpenAI covering all APIs. And my non-comprehensive search back then did not...
I am trying to understand how to upload a file with contents from memory and can't figure out how. It appears that file input can only come from a file path? Do...
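If the upload request in the version at hand only accepts a path, one workaround is to spill the in-memory bytes to a temporary file and upload that. A stdlib-only sketch of that step (the filename and contents are illustrative; newer crate versions may accept bytes directly, so check the `FileInput` docs first):

```rust
use std::fs;
use std::io::Write;
use std::path::PathBuf;

// Workaround sketch: write in-memory contents to a temp file so the
// resulting path can be handed to a path-based file-upload request.
fn write_temp_file(bytes: &[u8]) -> std::io::Result<PathBuf> {
    let path = std::env::temp_dir().join("upload.jsonl"); // illustrative name
    let mut file = fs::File::create(&path)?;
    file.write_all(bytes)?;
    Ok(path)
}

fn main() -> std::io::Result<()> {
    let contents = br#"{"prompt": "hi", "completion": "hello"}"#;
    let path = write_temp_file(contents)?;
    // `path` can now be passed wherever the client expects a file path.
    println!("wrote {} bytes to {}", contents.len(), path.display());
    Ok(())
}
```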
let job = open_ai_client
    .fine_tuning()
    .create(fine_tune_request)
    .await?;

FineTuningJob struct is like:

{
  "object": "fine_tuning.job",
  "id": "ftjob xxx",
  "model": "gpt-3.5-turbo-0613",
  "created_at": 1700626516,
  "finished_at": null,
  "fine_tuned_model": null,
  "organization_id": "org-zxxx",
  "result_files": [],
  "status":...
I am adding support for Cognitive Search as data source in Azure OpenAI Service.
Addresses https://github.com/64bit/async-openai/issues/74 and adds support for Embedding Deserialization. This also allows streaming completion to fail gracefully on any Stream ended event, as opposed to hanging until a timeout.
Added the `strict` option for function calls to enable structured output. See [Structured Output](https://openai.com/index/introducing-structured-outputs-in-the-api/)
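For reference, a sketch of what a strict function definition looks like on the wire, per OpenAI's structured-outputs announcement (the function name and schema here are illustrative; the crate's builder methods for producing this may differ by version):

```json
{
  "type": "function",
  "function": {
    "name": "get_weather",
    "strict": true,
    "parameters": {
      "type": "object",
      "properties": {
        "city": { "type": "string" }
      },
      "required": ["city"],
      "additionalProperties": false
    }
  }
}
```

With `strict: true`, the schema must mark every property as required and set `additionalProperties` to `false`, so the model's arguments are guaranteed to match the schema exactly.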