BBC-Esq
> Thank you for your info. I will test it ASAP; there are some problems with my GPUs right now. It seems like CUDA 12 does strange things compared to CUDA...
@minhthuc2502 Sure, will do. Feel free to close whatever issues you want related to this. I haven't had time to keep up during the last week or so with the...
Interesting. So does that mean it will no longer be open source, or what?
Oh, I'm not inherently opposed to open source for-profit stuff; it just affects the extent to which I contribute to and/or use something. I use lots of stuff that was...
I don't see the SOLAR model in the folder containing the other models (shown in the last image). That is very strange. Also, I'm not sure why there is the...
Yep, unchecking "chunks only" wouldn't change the vram usage, the model would still be loaded. But it's SUPPOSED to automatically remove the "local" model when you choose the use LM...
Does your VRAM usage roughly match what my release page says it should be for the various models? Are you still unable to download the SOLAR model? Any more details?
Feel free to reopen if this issue persists.
If I have a local LLM that is being run on a server like localhost, what would I modify to add my specific server information so it connects to it...
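For reference, a minimal sketch of the usual pattern for pointing a client at a local, OpenAI-compatible server such as LM Studio. The base URL, API key, and model name below are placeholder assumptions, not the project's actual configuration:

```python
# Minimal sketch, assuming the local server exposes an OpenAI-compatible
# endpoint (e.g., LM Studio's default at http://localhost:1234/v1).
# Host, port, and model name are placeholders to be replaced with your
# server's actual details.
from openai import OpenAI

# Point the client at the local server instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:1234/v1",  # replace with your server's address/port
    api_key="not-needed-for-local",       # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the model name your server reports
    messages=[{"role": "user", "content": "Hello from a local client."}],
)
print(response.choices[0].message.content)
```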
Also, any chance you can share some screenshots?