
Thanks for the catch; should be fixed via https://github.com/mlc-ai/web-llm/pull/371/files#diff-28c3e3e43a126ec69590b3e436ce6386f7605089d4964f95146c6bd02a84b01c

Hi! Are the models from https://github.com/mlc-ai/binary-mlc-llm-libs, or did you compile the models yourself? If it is the latter, you'd have to recompile them as in https://github.com/mlc-ai/binary-mlc-llm-libs/pull/90. If it is...

Hmm, this is not really expected, as I only added `apply_presence_and_frequency_penalty` in https://github.com/mlc-ai/web-llm/pull/299.
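For context, here is a minimal sketch of how OpenAI-style presence and frequency penalties are typically applied to logits; the function name, types, and shapes are illustrative, not web-llm's exact internals:

```typescript
// Sketch of OpenAI-style presence/frequency penalties (illustrative,
// not web-llm's actual implementation).
// `logits` holds the raw score for each token id; `tokenCounts` maps
// each token id that has already appeared in the output to its count.
function applyPresenceAndFrequencyPenalty(
  logits: Float32Array,
  tokenCounts: Map<number, number>,
  presencePenalty: number,
  frequencyPenalty: number,
): void {
  for (const [tokenId, count] of tokenCounts) {
    // The presence penalty applies once per token that has appeared;
    // the frequency penalty scales with how often it appeared.
    logits[tokenId] -= presencePenalty + frequencyPenalty * count;
  }
}
```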

This is likely due to an old version of the web-llm npm (if you are not building from source). If you are building from source, this is likely due to...

Thanks for the request! We should be able to add the prebuilt wasm files shortly. cc @YiyanZhai

Thanks for the list! WizardMath and OpenHermes can reuse the wasm of Mistral (as shown in `prebuiltAppConfig` in [src/config.ts](https://github.com/mlc-ai/web-llm/blob/main/src/config.ts)); CodeLlama should be able to reuse that of Llama-2, as long...
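To illustrate the wasm reuse, here is a hedged sketch of an app config where two models point at the same wasm; the field names follow the general shape of web-llm's `ModelRecord`, but the ids and URLs below are placeholders, not real artifacts:

```typescript
// Illustrative only: two Mistral-architecture models sharing one wasm.
// Treat the ids and URLs below as placeholders.
const appConfig = {
  model_list: [
    {
      model_id: "Mistral-7B-Instruct-q4f16_1",
      model: "https://huggingface.co/mlc-ai/Mistral-7B-Instruct-q4f16_1-MLC",
      model_lib: "https://example.com/libs/Mistral-7B-q4f16_1-webgpu.wasm",
    },
    {
      // OpenHermes is a Mistral fine-tune with the same architecture and
      // quantization, so it can reuse the Mistral wasm above.
      model_id: "OpenHermes-2.5-Mistral-7B-q4f16_1",
      model:
        "https://huggingface.co/mlc-ai/OpenHermes-2.5-Mistral-7B-q4f16_1-MLC",
      model_lib: "https://example.com/libs/Mistral-7B-q4f16_1-webgpu.wasm",
    },
  ],
};
```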

Update: StableLM 2 Zephyr 1.6B is now part of the prebuilt app config, as of 0.2.39. For music, this is not intended to be part of WebLLM's prebuilt; runtime is...
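As a usage sketch, loading a prebuilt model from the npm looks roughly like the following; the model_id string is an assumption here, so check `prebuiltAppConfig` in src/config.ts for the exact id:

```typescript
import * as webllm from "@mlc-ai/web-llm";

// Minimal sketch of loading a prebuilt model by id. The model_id below
// is assumed for illustration; look up the exact string in
// prebuiltAppConfig (src/config.ts).
async function main() {
  const engine = await webllm.CreateMLCEngine(
    "stablelm-2-zephyr-1_6b-q4f16_1-MLC",
    { initProgressCallback: (p) => console.log(p.text) },
  );
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```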

Ran into the same issue when testing on a mobile device. Using localtunnel like @felladrin suggested worked for me (thanks!). Alternatively, if you are testing on Android...

Closing this issue for now, as either port forwarding or localtunnel should resolve it. Feel free to reopen if issues persist.

@DavidGOrtega Just updated the npm to 0.2.24: https://github.com/mlc-ai/web-llm/pull/323 This should fix it. Let us know if issues persist. Thank you!