zeritonius
@DustinBrett This worked for me, fully local and offline: https://github.com/mlc-ai/web-llm/issues/19#issuecomment-1511754031
@eatcosmos
> I search by file size or file keywords(pkl model wgsl) through the whole computer drivers,

In similar situations, I search by date and see the most recently created...
I will wait for that tutorial. Thank you for the info.
@Ryan-yang125 At the above location, I found the compiled, executable file (.wasm), but not the text-format source file (.wat). Do I need a special kind...
@howie6879: thank you so much for the tutorial! I don't use Docker, but it works on my computer with [serve](https://github.com/vercel/serve) run from inside the docs folder, if I'm connected to...
@tqchen : thank you for the suggestion. If I run `jekyll serve --host localhost --baseurl /web-llm --port 8888` from inside the docs folder, I receive the same error if I'm...
@eniradi Thanks to your pointers, it now works the way I wanted (locally and offline).
Locally = the code runs on my computer.
Offline = I don't need to be...
@kadogo
> I didn't alter the cacheUrl, but I guess that it would work too.

Does it work even if you are not connected to the Internet? I believe your...
@intoempty
> I noticed that visiting the local path to /dist/vicuna-7b/vicuna-7b_webgpu.wasm in Canary directly

I'm not sure what you mean by "directly", so I tried both: I opened the file...
@kohlerm: Could you please try the link below and tell me whether you see a big red triangle or an error message? https://webgpu.github.io/webgpu-samples/samples/helloTriangle
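If the triangle sample fails, a quicker first check is whether the browser exposes WebGPU at all: `navigator.gpu` only exists when WebGPU is available (at the time of this thread, typically Chrome Canary with the right flags). The snippet below is a small sketch of such a feature check; the function names `hasWebGPU` and `describeAdapter` are hypothetical helpers, not part of web-llm.

```javascript
// Returns true when the given navigator-like object exposes WebGPU.
function hasWebGPU(nav) {
  return Boolean(nav && nav.gpu);
}

// Asks for a GPU adapter and reports what was found.
// `requestAdapter()` resolves to null when no suitable adapter exists.
async function describeAdapter(nav) {
  if (!hasWebGPU(nav)) return "WebGPU is not available in this browser.";
  const adapter = await nav.gpu.requestAdapter();
  return adapter ? "WebGPU adapter found." : "WebGPU present, but no adapter.";
}

// In a browser page:
//   describeAdapter(navigator).then(console.log);
```

If this reports no WebGPU at all, neither the triangle sample nor web-llm can work, regardless of how the model files are served.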