web-llm
Cache.add() encountered a network error
I'm seeing a Cache.add() error when trying to load Llama-3-8B-Instruct-q4f16_1-MLC:
Error: Cannot fetch https://huggingface.co/mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC/resolve/main/params_shard_0.bin err= NetworkError: Failed to execute 'add' on 'Cache': Cache.add() encountered a network error
This is Chrome 124.0.6367.119, running the demo at https://webllm.mlc.ai/
It looks like any model I haven't previously cached reports the error.
I don't think this is related to the regular browser HTTP cache, but for good measure I cleared it anyway; same issue.
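One way to narrow this down is to fetch the failing shard URL directly, separating "the network request itself fails" from "Cache.add() rejects". A minimal sketch; `probeUrl` is a hypothetical helper name, and `fetch` is built into browsers and Node 18+:

```javascript
// Fetch a URL directly and report whether the request succeeded,
// independent of the Cache API.
async function probeUrl(url) {
  try {
    const res = await fetch(url);
    return { ok: res.ok, status: res.status };
  } catch (err) {
    // Network-level failure (DNS, CORS, offline, etc.)
    return { ok: false, error: String(err) };
  }
}

// e.g. probeUrl("https://huggingface.co/mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC/resolve/main/params_shard_0.bin")
```

If this succeeds from the same page while Cache.add() still fails, the problem is more likely on the caching/storage side than the network.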
@thekevinscott I'm facing the same issue on chrome v124
I cleared my cache, but no improvement.
cc: @CharlieFRuan
Is this issue encountered for all models? To be honest, the Cache.add() error message is a bit vague; I've encountered it when the URL is wrong, but that does not seem to be the case here.
To triage a bit, could you check, in DevTools, Application -> Cache Storage, and see whether webllm/config or webllm/wasm are populated? I want to see whether this is specific to weight downloading, or also applies to the config JSON file and the wasm.
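The same check can be scripted from the DevTools console. A sketch; `listWebLLMCaches` is a hypothetical name, and the "webllm" prefix matches the bucket names visible under Cache Storage:

```javascript
// List WebLLM's Cache Storage buckets and how many entries each holds.
// Run in the browser DevTools console; outside a browser (no `caches`
// global) it simply returns an empty array.
async function listWebLLMCaches() {
  if (typeof caches === "undefined") return [];
  const names = (await caches.keys()).filter((n) => n.startsWith("webllm"));
  const result = [];
  for (const name of names) {
    const cache = await caches.open(name);
    const entries = await cache.keys();
    result.push({ name, entries: entries.length });
  }
  return result;
}
```

An empty `webllm/model` bucket alongside populated `webllm/config` and `webllm/wasm` buckets would point at weight downloading specifically.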
Besides, could you try a smaller model, say TinyLlama? That would tell us whether a single weight shard being too large is the cause.
@CharlieFRuan I've just tried again (same Chrome version) and, lo and behold, it appears to be working. Perhaps there was a new deploy of webllm in the meantime?
I do see entries under Cache Storage, but since it now appears to be working, I'm not sure that's helpful. I wonder if @ucalyptus2 is still seeing the issue.
Hi, @tqchen
I got the same issue here, although it mysteriously worked on my mobile. Maybe I don't have enough free space on my PC (>5 GB)? Tested on Chrome 128 and Arc Browser.
Here is my cache:
I've just been running into this, and then I looked at the free disk space on my Mac: it was 1.8 GB!
I removed about 30 GB of files and now everything seems fine!
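For anyone else hitting this: Chrome derives the Cache Storage quota from free disk space, so a tiny reported quota is a strong hint the disk is nearly full. A sketch using the standard StorageManager API; `checkStorageQuota` is a hypothetical name:

```javascript
// Report how much Cache Storage quota the browser grants and how much is
// used. Run in the DevTools console; outside a browser it returns null.
async function checkStorageQuota() {
  if (typeof navigator === "undefined" || !navigator.storage?.estimate) {
    return null; // StorageManager API not available in this environment
  }
  const { usage = 0, quota = 0 } = await navigator.storage.estimate();
  const gib = (n) => (n / 2 ** 30).toFixed(2);
  console.log(`Cache Storage: ${gib(usage)} GiB used of ~${gib(quota)} GiB quota`);
  return { usage, quota };
}
```

If the quota comes back well under the model's size (several GiB for an 8B q4 model), freeing disk space is the likely fix.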