LocalAI
Ministral-3: model fails to load with "rpc error: code = Canceled desc ="
LocalAI version:
3.8.0
Environment, CPU architecture, OS, and Version:
Linux fw 6.17.12-1-MANJARO #1 SMP PREEMPT_DYNAMIC Fri, 12 Dec 2025 19:42:49 +0000 x86_64 GNU/Linux
Describe the bug
I import https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512-GGUF/resolve/main/Ministral-3-3B-Instruct-2512-Q5_K_M.gguf via http://localhost:8080/import-model and then simply try to start a chat with the model. The load fails with "rpc error: code = Canceled desc =".
To Reproduce
1. Import the GGUF from the URL above via the import-model page.
2. Send any chat completion request to the imported model.
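A minimal request that triggers the failure could look like this (the model name is assumed to match the imported GGUF filename, as shown in the logs below):

```shell
# Hypothetical reproduction: any chat completion against the imported model
# triggers the load failure. Model name assumed from the imported filename.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Ministral-3-3B-Instruct-2512-Q5_K_M.gguf",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```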
Expected behavior
The model loads and the chat completion returns a response.
Logs
4:51PM INF BackendLoader starting backend=llama-cpp modelID=Ministral-3-3B-Instruct-2512-Q5_K_M.gguf o.model=Ministral-3-3B-Instruct-2512-Q5_K_M.gguf
4:51PM ERR Failed to load model Ministral-3-3B-Instruct-2512-Q5_K_M.gguf with backend llama-cpp error="failed to load model with internal loader: could not load model: rpc error: code = Canceled desc = " modelID=Ministral-3-3B-Instruct-2512-Q5_K_M.gguf
4:51PM ERR Stream ended with error: failed to load model with internal loader: could not load model: rpc error: code = Canceled desc =
4:51PM INF HTTP request method=POST path=/v1/chat/completions status=200
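The log truncates at the rpc "Canceled" error without showing the underlying llama-cpp failure; I can rerun with debug logging to capture more detail if useful. A sketch, assuming the documented DEBUG environment variable (adjust if your deployment starts LocalAI differently):

```shell
# Re-run LocalAI with verbose logging to surface the underlying
# llama-cpp load error behind the generic rpc Canceled message.
DEBUG=true local-ai run
```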
Additional context