OnnxStream

Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a Raspberry Pi Zero 2 (or in 298 MB of RAM), but also Mistral 7B on desktops and servers. ARM, x86, WASM, RISC-...

Results: 47 OnnxStream issues

For example, YOLOv5 is now available as an ONNX model; I would like to use this library to save memory. Is that supported?

Can you please add more OnnxStream models and make it LoRA compatible? I want the ffusionaiXL model and the endjourney model

Hey, if I want to use a custom model, does it have to be in Diffusers format? I know how to add Diffusers models, but I want to know before I install it...

Hello, I am currently exploring the capabilities of OnnxStream for generating images using diffusion models like Stable Diffusion, and I would like to know if it is possible to adjust...

Please create the following browser WASM demos: 1) Stable Diffusion with W8A8 quantization. This is important because the Stable Diffusion [demo](https://intel.github.io/web-ai-showcase/) which I saw uses fp16 weights with transformers.js as...

Everything works fine until the cmake build starts failing: `cmake --build . --config Release`

```
cmake -DMAX_SPEED=ON -DXNNPACK_DIR= ..
cmake --build . --config Release
```

**I'm using:** Linux ubuntu 6.1.0-1023-rockchip #23-Ubuntu...
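One detail worth checking in reports like the one above: OnnxStream's CMake configure step takes `-DXNNPACK_DIR=` pointing at a built XNNPACK checkout, and in the quoted command that value is empty. A minimal pre-flight sketch (the `check_xnnpack_dir` helper and the `$HOME/XNNPACK` path are hypothetical, not part of OnnxStream):

```shell
# Hypothetical pre-flight check before configuring OnnxStream.
# Assumption: XNNPACK is cloned and built separately, and its path is
# passed to CMake via -DXNNPACK_DIR=... ; an empty value, as in the
# failing command `cmake -DMAX_SPEED=ON -DXNNPACK_DIR= ..`, is a
# common misconfiguration.
check_xnnpack_dir() {
  if [ -z "$1" ]; then
    echo "XNNPACK_DIR is empty: pass the path of your XNNPACK checkout"
    return 1
  fi
  echo "XNNPACK_DIR=$1"
  return 0
}

# An empty value reproduces the misconfiguration; a real path passes.
check_xnnpack_dir "" || echo "configure would likely fail"
check_xnnpack_dir "$HOME/XNNPACK"
```

With a valid path, the configure and build commands from the report would then be run as `cmake -DMAX_SPEED=ON -DXNNPACK_DIR="$HOME/XNNPACK" ..` followed by `cmake --build . --config Release`.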

Is it possible to support Flux Schnell in 500 MB of RAM with this?