Robert Sinclair
I mean without a docker container.
@mudler thanks.
> For the record, https://github.com/mlc-ai/web-llm/tree/main/examples/simple-chat-upload is an example of supporting local model in the app...
You can find my quantizations (in gguf) on huggingface here: https://huggingface.co/ZeroWw if you are interested.
> Mind providing some more context about how you experienced this error? Sure. I downloaded the executable from the release and tested it on Google Colab. It output a base64...
(By the way, to have a shell on Google Colab I developed this: https://github.com/0wwafa/shell.) But I like the WebRTC approach.
It has to do with copying and pasting, because if I do it from the shell (my shell) I get: ``` have-remote-offer stable checking Answer created. Send the following answer...
According to Stack Overflow: > You are using CPU-only PyTorch, but your code has a statement like cr = nn.CrossEntropyLoss(weight=torch.tensor(classes_weights).cuda()) which is trying to move the tensor to the GPU. > >...
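A minimal sketch of the usual fix for that error: select the device at runtime instead of hard-coding `.cuda()`, so the same code runs on a CPU-only PyTorch build. The `classes_weights` values here are placeholders, not taken from the original code.

```python
import torch
import torch.nn as nn

# Pick CUDA when a GPU build is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical per-class weights; substitute your real values.
classes_weights = [1.0, 2.0, 0.5]

# Build the weight tensor directly on the chosen device instead of
# calling .cuda(), which crashes on CPU-only installs.
weight = torch.tensor(classes_weights, device=device)
criterion = nn.CrossEntropyLoss(weight=weight)
```

The same `device` object can then be reused for the model and input batches (`model.to(device)`, `inputs.to(device)`), keeping everything on one device.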
@jpeek34556 was that supposed to be an answer to my question? I only understood "good luck" :P
Well... there should be a flag or something to make that possible.