bil-ash

72 comments by bil-ash

@xenova Any updates? It seems to be well suited for encoder-decoder & encoder-only models, so Whisper models running in the browser should be the biggest beneficiaries of this conversion.

The [docs](https://github.com/microsoft/onnxruntime/pull/21835) were also updated recently. @xenova Please make it available in transformers.js. It will be very helpful for encoder-decoder models like Whisper and NLLB.
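For context, here is a minimal sketch of how Whisper and NLLB encoder-decoder models are typically run through transformers.js pipelines today; the package import and the model ids (`Xenova/whisper-tiny.en`, `Xenova/nllb-200-distilled-600M`) are illustrative assumptions, not part of the linked onnxruntime change.

```ts
// Minimal sketch (assumed model ids) of the encoder-decoder pipelines
// that would benefit from this conversion in transformers.js.
import { pipeline } from '@xenova/transformers';

// Speech recognition with a Whisper encoder-decoder model.
const transcriber = await pipeline(
  'automatic-speech-recognition',
  'Xenova/whisper-tiny.en' // assumed model id
);
const transcript = await transcriber('https://example.com/sample.wav');

// Translation with an NLLB encoder-decoder model.
const translator = await pipeline(
  'translation',
  'Xenova/nllb-200-distilled-600M' // assumed model id
);
const translated = await translator('Hello, world!', {
  src_lang: 'eng_Latn',
  tgt_lang: 'fra_Latn',
});

console.log(transcript, translated);
```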

Just a few suggestions: 1) The translate option should be available in the three-dots (per-post) menu on each post and not tied to a user's locale. My locale might be A...

Same request here. I too would like to have gemma3n support in web-llm.
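As a rough illustration of what this would look like once supported, here is a minimal sketch using web-llm's existing engine API; the model id `gemma-3n-E2B-it-q4f16_1-MLC` is a hypothetical placeholder, since gemma3n is not in the prebuilt model list yet.

```ts
import { CreateMLCEngine } from '@mlc-ai/web-llm';

// Hypothetical model id -- gemma3n is not shipped in web-llm yet,
// which is exactly what this request is about.
const engine = await CreateMLCEngine('gemma-3n-E2B-it-q4f16_1-MLC', {
  initProgressCallback: (report) => console.log(report.text),
});

// Standard OpenAI-style chat call once the model is loaded.
const reply = await engine.chat.completions.create({
  messages: [{ role: 'user', content: 'Summarize this page in one sentence.' }],
});
console.log(reply.choices[0].message.content);
```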

> Assign me this issue

I guess you are free to open a PR and no one needs to assign you for that.

@dansup Would be nice if this feature is implemented before the next release.

@dansup Any update on this?

@xenova onnxruntime is adding 2-bit quantization [support](https://github.com/microsoft/onnxruntime/pull/25542) for avx2, avx512 and neon. Also, recent emscripten versions support avx2 [intrinsics](https://emscripten.org/docs/porting/simd.html). If these two are combined to build onnxruntime with 2-bit quantized...

Yes, this would be a really useful feature.

Yeah, I would also like to provide users with the option to sign up via SSO; for the moment only Google, Facebook and Twitter.