async-openai
Wasm support
Thanks for making this crate, it seems very useful :)
Currently the crate always depends on tokio (among other things), which means it can't be compiled to wasm for use in frontends (or serverless wasm workers like those on AWS/Cloudflare) that want to make OpenAI API requests. It would be great if this crate could also be compiled to wasm.
Thank you 😌
Given that reqwest supports wasm, I would like to have wasm support too.
I'm not very familiar with the wasm ecosystem, but it seems tokio has work in progress for it: https://github.com/tokio-rs/tokio/issues/4827
Perhaps via a feature flag to switch between tokio and wasm as an initial starting point for wasm support? Would love your input/ideas on implementation.
Yes, tokio could be an optional feature. Wasm doesn't need the tokio runtime (and it wouldn't be desirable because of the bloat), but reqwest works on wasm as well. I'm not sure which other deps are only needed outside of wasm. If there are more deps that aren't needed on wasm, you could put them all under one feature named "runtime".
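A rough sketch of how the manifest could look under that idea (the feature layout is illustrative, not the crate's actual Cargo.toml):

```toml
[dependencies]
# tokio becomes optional; native builds opt back in via the feature below.
tokio = { version = "1", features = ["fs", "macros"], optional = true }

[features]
# Everything that only makes sense outside of wasm lives behind "runtime".
default = ["runtime"]
runtime = ["dep:tokio"]
```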
I would love to be able to implement a different async solution.
I am building a new app and was trying to make it a non-async app that uses this crate for async calls by blocking with futures::executor's block_on; then I realized the tokio requirement.
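Roughly what I was attempting, as a sketch (the models().list() call is just an example endpoint; it compiles, but the request future panics at runtime because reqwest's default connector expects a tokio reactor, which is how I hit the requirement):

```rust
use async_openai::Client;
use futures::executor::block_on;

fn main() {
    let client = Client::new(); // reads OPENAI_API_KEY from the environment
    // block_on drives the future on the current thread without a tokio
    // runtime -- but reqwest's connector looks for a tokio reactor, so
    // polling the request panics with "there is no reactor running".
    match block_on(client.models().list()) {
        Ok(models) => println!("{} models available", models.data.len()),
        Err(e) => eprintln!("request failed: {e}"),
    }
}
```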
I think futures-rs would be a great choice for async, as it is also compatible with no_std environments. I would love to have no_std async access to the OpenAI API.
https://github.com/rust-lang/futures-rs
My use case is personal devices that connect to the OpenAI API for voice-to-text, and then to ChatGPT.
I think wasm devs would also appreciate using this crate
I may try to help here at some point in the near future. I am currently making a bot that upgrades code automatically, using your library, so maybe I will point it in this direction to test it out...
Hi @cosmikwolf
It seems that support for a different async executor should be a separate issue? Is that somehow related to WASM as well?
Hello @64bit, I have little experience with WASM architecture but would like to pick this up in the coming week.
Thank you @cosmikwolf and @Doordashcon for offering to contribute!
I'll let you guys coordinate on this thread.
To consider this resolved, we should have at least one working example for AWS Lambda or Cloudflare (or both, if you're feeling adventurous :))
+1. I skimmed through the code searching for tokio, and it seems most of the usage relates to files. So I guess the easiest first step is to gate file-related ops behind a feature with an optional tokio dependency. Those who want to upload/download audio/images would have to wait for a while, but text-only functions should just work on wasm, I guess?
Update: except this one (and only this one) I guess https://github.com/64bit/async-openai/blob/faaa89fb5aafe5cda85d92e45d23a4d0d0b99be5/async-openai/src/client.rs#L293
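As a hypothetical sketch of the gating idea (the helper name is made up, not the crate's actual code):

```rust
use std::path::Path;

// Only compiled when a (hypothetical) "tokio" feature is enabled; without
// it the file helpers simply don't exist, and the rest of the crate
// carries no tokio dependency.
#[cfg(feature = "tokio")]
pub async fn save_bytes(path: &Path, bytes: &[u8]) -> std::io::Result<()> {
    tokio::fs::write(path, bytes).await
}
```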
Getting started without streaming and file support, but testable through examples, would still be a good first step!
Hi all! If you can help testing #120 and/or try it on wasm, it would be great.
Updates from release notes in 0.17.0:
WASM support lives in the experiments branch. To use it, please pin directly to a git sha in your Cargo.toml. Any discussion or issues related to WASM are welcome in https://github.com/64bit/async-openai/issues/102. Any WASM-related PRs are welcome in the experiments branch.
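For example, a Cargo.toml entry along these lines (the rev is a placeholder; substitute a real commit sha from the experiments branch):

```toml
[dependencies]
# Pin to a specific commit so builds stay reproducible; replace the
# placeholder with an actual sha from the experiments branch.
async-openai = { git = "https://github.com/64bit/async-openai", rev = "<commit-sha>" }
```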
I am maintaining the code for wasm support and I am trying to stabilize wasm target(s) in main. So perhaps we can discuss a more detailed plan here?
Current State
wasm32-unknown-unknown is already working. See the example openai-web-app. And to my knowledge, WASI support should just work, since wasm32-unknown-unknown is the bare minimum.
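For reference, a minimal sketch of driving a request on wasm32-unknown-unknown, assuming the Client API matches the native one (the function name and key handling are illustrative):

```rust
use async_openai::{config::OpenAIConfig, Client};
use wasm_bindgen_futures::spawn_local;

// No tokio runtime on wasm: the browser's event loop drives the future
// via wasm-bindgen-futures.
pub fn kick_off_request() {
    // No environment variables in the browser, so the key is set explicitly.
    let client = Client::with_config(OpenAIConfig::new().with_api_key("sk-..."));
    spawn_local(async move {
        match client.models().list().await {
            Ok(models) => {
                web_sys::console::log_1(&format!("{} models", models.data.len()).into())
            }
            Err(e) => web_sys::console::log_1(&format!("error: {e}").into()),
        }
    });
}
```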
Implementation Plan (not complete)
If you have something in mind, please leave a comment or help out with the implementation.
Tracking List:
- Wasi example(s) on AWS/Cloudflare:
  - [ ] Compiles to wasm32-wasi target
  - [x] Cloudflare example #178
Since I want to publish a crate that depends on the WASM feature of async-openai, I need to publish WASM support first, so I'd like to publish my fork on crates.io under the name async-openai-wasm soon.
Thank you for the heads up. I think that's a good way forward to keep it sustainable.
Do you plan to make it permanent? If so, you're welcome to link the new crate in the README. That way we can also close the experiments branch and remove the related doc from the README.
Yeah, I've made a PR. Thanks!
WASM support has a new home: https://github.com/ifsheldon/async-openai-wasm, hence closing.