remmen-io

2 comments of remmen-io

Simply download your model to a location (here `models-cache`) and set `HUGGINGFACE_HUB_CACHE=/models-cache` and `MODEL_ID=/models-cache/deepseek-coder-33B-instruct-AWQ`.
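
For context, a minimal sketch of how those two variables could be wired into a TGI container start. The image tag, port mapping, shm size, and the `--quantize awq` flag are assumptions added for illustration, not part of the original comment:

```
# Sketch: run TGI against a model already downloaded into ./models-cache
docker run --gpus all --shm-size 1g -p 8080:80 \
  -v "$PWD/models-cache:/models-cache" \
  -e HUGGINGFACE_HUB_CACHE=/models-cache \
  -e MODEL_ID=/models-cache/deepseek-coder-33B-instruct-AWQ \
  ghcr.io/huggingface/text-generation-inference:latest \
  --quantize awq   # AWQ checkpoints typically need this launcher flag
```

Because the model already sits in the mounted cache, TGI loads it from disk instead of pulling it from the Hugging Face Hub at startup.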

I've tried to set up the `localai` backend to point to a local endpoint served by Hugging Face TGI:

```
k8sgpt auth update localai --model tgi --baseurl https://deepseek.k8scluster.ch/v1
```

but I get a...
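
As a quick sanity check of the endpoint itself (independent of k8sgpt), one could call the OpenAI-compatible Messages API that recent TGI versions expose under `/v1`; this assumes such a TGI version is running behind the URL above:

```
# Hypothetical check that the TGI endpoint answers OpenAI-style chat requests
curl https://deepseek.k8scluster.ch/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "tgi", "messages": [{"role": "user", "content": "ping"}]}'
```

If this returns a chat completion, the endpoint side is working and the issue is more likely in the k8sgpt backend configuration.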