async-openai

Wasm support

Open Boscop opened this issue 1 year ago • 14 comments

Thanks for making this crate, it seems very useful :)

Currently the crate always depends on tokio (among other things), which means it can't be compiled to wasm for use in frontends (or serverless wasm workers like those on AWS/Cloudflare) that want to make OpenAI API requests. It would be great if this crate could also be compiled to wasm.

Boscop avatar Aug 25 '23 16:08 Boscop

Thank you 😌

Given that reqwest supports wasm, I would like to have wasm support too.

I'm not very familiar with the wasm ecosystem, but it seems tokio has work in progress for it: https://github.com/tokio-rs/tokio/issues/4827

Perhaps a feature flag to switch between tokio and wasm as an initial starting point for wasm support? Would love your input/ideas on implementation.

64bit avatar Aug 26 '23 01:08 64bit

Yes, tokio could be an optional feature. Wasm doesn't need the tokio runtime (and it wouldn't be desirable because of the bloat), but reqwest works on wasm as well. I'm not sure which other deps are only needed outside of wasm. If there are more deps that aren't needed on wasm, you could put them all under one feature named "runtime".
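
To make that concrete, here is a rough sketch of what the manifest could look like (the feature name "runtime" and the dependency versions are only placeholders, not the crate's actual Cargo.toml):

```toml
[features]
# Hypothetical feature gathering everything that only makes sense outside wasm.
default = ["runtime"]
runtime = ["dep:tokio"]

[dependencies]
# reqwest itself compiles for wasm32-unknown-unknown, so it can stay unconditional.
reqwest = { version = "0.11", features = ["json"] }
# tokio is only pulled in when the "runtime" feature is enabled.
tokio = { version = "1", features = ["fs", "rt"], optional = true }
```

A wasm build would then use `--no-default-features` to drop tokio entirely.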

Boscop avatar Aug 26 '23 11:08 Boscop

I would love to be able to implement a different async solution.

I am building a new app and was trying to make it a non-async app that uses this crate to make async calls by blocking with futures::executor::block_on, and then I realized the tokio requirement.
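
Roughly, what I was attempting looked like the sketch below (method names approximated from the crate's examples, so treat it as illustrative only). It compiles, but it fails at runtime because the underlying reqwest/hyper client expects a tokio reactor:

```rust
use async_openai::{types::CreateCompletionRequestArgs, Client};

fn main() {
    // Client::new() picks up OPENAI_API_KEY from the environment.
    let client = Client::new();

    let request = CreateCompletionRequestArgs::default()
        .model("text-davinci-003")
        .prompt("Say hello")
        .build()
        .expect("valid request");

    // Blocking with the futures executor instead of #[tokio::main]:
    // this panics with "there is no reactor running" because the HTTP
    // client still needs a tokio runtime underneath.
    let _response = futures::executor::block_on(client.completions().create(request));
}
```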

I think futures-rs would be a great choice for async, as it is also compatible with no_std environments. I would love to have no_std async access to the OpenAI API.

https://github.com/rust-lang/futures-rs

My use case is personal devices that connect to the OpenAI API for voice-to-text, and then to ChatGPT.

I think wasm devs would also appreciate using this crate.

I may try and help here at some point in the near future. I am currently making a bot to make code upgrades automatically, using your library, so maybe I will point it in this direction to test it out...

cosmikwolf avatar Sep 01 '23 21:09 cosmikwolf

Hi @cosmikwolf

It seems that support for a different async executor should be a separate issue? Is that somehow related to WASM as well?

64bit avatar Sep 04 '23 00:09 64bit

Hello @64bit, I have little experience with WASM architecture but would like to pick this up in the coming week.

Doordashcon avatar Sep 05 '23 21:09 Doordashcon

Thank you @cosmikwolf and @Doordashcon for offering to contribute!

I'll let you guys coordinate on this thread.

To consider this resolved, we should at least have one working example for AWS Lambda or Cloudflare (or both if you're feeling adventurous :))

64bit avatar Sep 07 '23 00:09 64bit

+1. I skimmed through the code searching for tokio, and it seems most of the usage relates to files. So I guess the easiest first step is to gate file-related ops behind a feature with an optional tokio dependency. Those who want to upload/download audio/images would have to wait for a while, but text-only functions should just work on wasm, I guess?

Update: except this one (and only this one), I guess: https://github.com/64bit/async-openai/blob/faaa89fb5aafe5cda85d92e45d23a4d0d0b99be5/async-openai/src/client.rs#L293
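
To illustrate the shape of that gating (the feature name and helper below are hypothetical, assuming tokio is declared as an optional dependency with its "fs" feature and a cargo feature like `file-ops = ["dep:tokio"]`):

```rust
use std::path::Path;

// Only compiled when the hypothetical `file-ops` feature (and thus tokio) is enabled.
#[cfg(feature = "file-ops")]
pub async fn read_file_bytes(path: &Path) -> std::io::Result<Vec<u8>> {
    tokio::fs::read(path).await
}

// Without the feature (e.g. on wasm32), callers get a clear error instead of an
// unusable tokio dependency.
#[cfg(not(feature = "file-ops"))]
pub async fn read_file_bytes(_path: &Path) -> std::io::Result<Vec<u8>> {
    Err(std::io::Error::new(
        std::io::ErrorKind::Unsupported,
        "file operations require the `file-ops` feature",
    ))
}
```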

ifsheldon avatar Oct 15 '23 06:10 ifsheldon

Getting started without streaming and file support, but testable through examples, would still be a good first step!

64bit avatar Oct 15 '23 07:10 64bit

Hi all! If you can help test #120 and/or try it on wasm, that would be great.

ifsheldon avatar Oct 15 '23 12:10 ifsheldon

Update from the release notes for 0.17.0:

WASM support lives in the experiments branch. To use it, please pin directly to a git sha in your Cargo.toml. Any discussion or issues related to WASM are welcome in https://github.com/64bit/async-openai/issues/102. Any WASM-related PRs are welcome against the experiments branch.
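
For reference, pinning would look something like this in Cargo.toml (`<commit-sha>` is a placeholder for whichever sha on the experiments branch you want to pin to):

```toml
[dependencies]
async-openai = { git = "https://github.com/64bit/async-openai", rev = "<commit-sha>" }
```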

64bit avatar Nov 26 '23 05:11 64bit

I am maintaining the code for wasm support and I am trying to stabilize the wasm target(s) on main. So perhaps we can discuss a more detailed plan here?

Current State

The wasm32-unknown-unknown target is already working; see the openai-web-app example. And to my knowledge, wasi support should just work, since wasm32-unknown-unknown is the bare minimum.

Implementation Plan (not complete)

If you have something in mind, please leave a comment or help out with the implementation.

Tracking List:

  • Wasi example(s) on AWS/Cloudflare:
    • [ ] Compiles to wasm32-wasi target
    • [x] Cloudflare example #178

ifsheldon avatar Jan 08 '24 10:01 ifsheldon

Since I want to publish a crate that depends on the WASM feature of async-openai, I need the WASM support to be published first, so I'd like to publish my fork on crates.io under the name async-openai-wasm soon.

ifsheldon avatar Apr 16 '24 11:04 ifsheldon

Thank you for the heads up. I think that's a good way forward to keep it sustainable.

Do you plan to make it permanent? If so, you're welcome to link the new crate in the README, and then we can also close the experiments branch and remove the related doc from the README.

64bit avatar Apr 28 '24 22:04 64bit

Yeah, I've made a PR. Thanks!

ifsheldon avatar May 08 '24 11:05 ifsheldon

WASM support has a new home https://github.com/ifsheldon/async-openai-wasm, hence closing

64bit avatar Jun 05 '24 00:06 64bit