PawelSzpyt

Results 4 comments of PawelSzpyt

Well, the 4060 Ti is cheap and can come with 16GB VRAM, so it's the first card that comes to mind. I was planning on buying a laptop with a 3080 and 16GB VRAM, ...

I can confirm: I downloaded the full model directly from Meta and I can't convert it to GGUF with llama.cpp. I'm on a Mac, but I guess that doesn't really matter. Tried convert...

I did try the latest version; it still does not work on my macOS 14.4.1 with the weights downloaded from Meta. It works fine, without any problems and very fast, if you download the weights from Hugging Face. ...
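For what it's worth, the route that does convert cleanly goes through the Hugging Face format rather than feeding Meta's raw consolidated.*.pth checkpoint to llama.cpp directly. Below is a rough sketch of that two-step path; the script names, paths, and flags are assumptions based on the transformers and llama.cpp repos (not taken from my exact run), so check the current docs before copying:

```python
# Sketch: Meta's raw Llama checkpoint -> Hugging Face format -> GGUF.
# All paths, script locations, and flags below are assumptions; adjust to
# your local clones of transformers and llama.cpp.
import subprocess

META_DIR = "/path/to/meta-llama-download"   # contains consolidated.*.pth + params.json
HF_DIR = "/path/to/llama-hf"                # Hugging Face-format output
GGUF_OUT = "/path/to/llama-f16.gguf"

# Step 1: convert the Meta checkpoint to Hugging Face format
# (conversion script ships with the transformers repo).
subprocess.run(
    [
        "python",
        "transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py",
        "--input_dir", META_DIR,
        "--model_size", "8B",               # assumption: adjust to the model you downloaded
        "--output_dir", HF_DIR,
    ],
    check=True,
)

# Step 2: convert the Hugging Face-format model to GGUF with llama.cpp's converter.
subprocess.run(
    [
        "python",
        "llama.cpp/convert-hf-to-gguf.py",
        HF_DIR,
        "--outfile", GGUF_OUT,
        "--outtype", "f16",
    ],
    check=True,
)
```

If you grabbed the weights from Hugging Face in the first place, only the second step should be needed, which matches what I saw: that path converts without any problems.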