Download, save and load LLMs from external drive
Hello, thank you for this great application! I wanted to ask whether it would be possible to save all downloaded LLMs on an external / secondary internal drive, instead of the default location.
LLMs are massive in size, so Alpaca could definitely benefit from a custom location option in the Preferences.
It would also be good to know whether this is compatible with the `ollama` CLI, and pulling models from there if they already exist would be needed too. I'm currently using the Ollama API for VS Code support, since Alpaca's download of llama3.1 405b got stuck at 75%.
INFO [connection_handler.py | start] client version is 0.3.9
INFO [connection_handler.py | request] GET : http://127.0.0.1:11435/api/tags
Gdk-Message: 20:27:50.308: Error 71 (Protocol error) dispatching to Wayland display.
I would like this feature, too. My home directory is on a drive that doesn't have enough free space for the larger LLMs. I have another drive with plenty of space.
I have some news for the 3.0.0 release of Alpaca:
You can now select a custom directory for loading and downloading models. It doesn't transfer existing models from the default directory, though.
https://github.com/Jeffser/Alpaca/commit/e450547bea2b184ba44a25f4a7354b02689bdb7f
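For anyone who wants to relocate models by hand in the meantime, here is a minimal sketch assuming the standalone Ollama server honors its documented `OLLAMA_MODELS` environment variable; the paths below are hypothetical placeholders, not real mount points:

```shell
# Hypothetical target path; replace with your actual external-drive mount point.
NEW_DIR="/tmp/external-drive/ollama-models"

# Create the target directory on the external drive.
mkdir -p "$NEW_DIR"

# Standalone Ollama stores models under ~/.ollama/models by default;
# copy any existing models across before switching (uncomment to run):
# cp -r ~/.ollama/models/. "$NEW_DIR"

# Point Ollama at the new location. The server reads OLLAMA_MODELS at
# startup, so restart `ollama serve` (or its systemd service) afterwards.
export OLLAMA_MODELS="$NEW_DIR"
echo "Models directory: $OLLAMA_MODELS"
```

Note that `OLLAMA_MODELS` applies to a standalone Ollama install; whether Alpaca's bundled instance respects it may differ from the in-app directory option added in the commit above.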