@lexasub Hi, I'm just curious how you boosted the tensor split; the wasted time was really frustrating for me. The same model on a 3-device cluster needs 3...
@lexasub Splendid! The instructions above do boost the split via RPC among Nvidia GPU devices. I also tried connecting Mac devices, which still showed low efficiency to...
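For reference, a minimal sketch of the RPC setup being discussed (hostnames, the port, and the split ratios are placeholders, not recommendations; `rpc-server`, `--rpc`, and `--tensor-split` are from llama.cpp builds compiled with RPC support):

```shell
# On each worker machine, start the llama.cpp RPC server
# (requires a build configured with -DGGML_RPC=ON; port is a placeholder).
rpc-server -p 50052

# On the head node, point llama-cli at the workers and split tensors
# evenly across the three devices (ratios here are only an example).
llama-cli -m model.gguf -ngl 99 \
  --rpc worker1:50052,worker2:50052 \
  --tensor-split 1,1,1 \
  -p "Hello"
```

Uneven `--tensor-split` ratios (e.g. `3,1,1`) can help when one device has more VRAM than the others.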
Hi, unlike models downloaded from Hugging Face, the default model path is the folder `llama.cpp/models/` rather than the `~/.cache/` one.
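A quick way to check both locations (a sketch, assuming a llama.cpp checkout in the current directory; the `~/.cache/huggingface/hub/` path is the usual huggingface_hub cache, which may differ on your setup):

```shell
# Models placed manually live here by default:
ls llama.cpp/models/

# Models fetched via huggingface_hub land in the cache instead:
ls ~/.cache/huggingface/hub/
```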
> > Hi, unlike models downloaded from Hugging Face, the default model path is the folder `llama.cpp/models/` rather than the `~/.cache/` one. > > Thanks for replying....
Sorry, `ls` on which folder exactly, as you mentioned?
Hi all, you may actually find several download options for different models, including all open-source DeepSeek models, in the provided user interface if you successfully...
> > Hi all, you may actually find several download options for different models, including all open-source DeepSeek models, in the provided user interface if...
I don't know much about the Linux case, since Hugging Face seems to have been providing macOS users with the MLX models of DeepSeek, if I...
> > I don't know much about the Linux case, since Hugging Face seems to have been providing macOS users with the MLX models of DeepSeek...
@borch84 Hi, exo currently uses its own pretty names for all supported models; please find and read `exo/models.py`. Besides, I don't know whether you are trying to...
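As a rough illustration of what reading those names out could look like (the exact structure of `exo/models.py` may differ between versions; the `model_cards` dict and `pretty_name` field below are assumptions, and the two entries are stand-ins, not the real list):

```python
# Sketch: enumerate model short names and their pretty names from a
# model_cards-style mapping, similar in spirit to what exo/models.py defines.
# The dict below is a placeholder; the real file may use different fields.
model_cards = {
    "llama-3.2-1b": {"pretty_name": "Llama 3.2 1B"},
    "deepseek-v3": {"pretty_name": "DeepSeek V3"},
}

def list_models(cards):
    """Return (short_name, pretty_name) pairs sorted by short name."""
    return sorted(
        (name, meta.get("pretty_name", name)) for name, meta in cards.items()
    )

for short, pretty in list_models(model_cards):
    print(f"{short}: {pretty}")
```

The short names (the dict keys) are what you would pass on the command line; the pretty names are only for display.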