Ollama does not list installed models
What is the issue?
The command "ollama list" does not list the installed models on the system (at least those created from a local GGUF file), which prevents other utilities (for example, WebUI) from discovering them.
However, the models are there and can be invoked by specifying their name explicitly. For example: "ollama run MyModel".
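For reference, a minimal reproduction would look something like this (the model and file names here are just examples, not the exact ones I used):

ollama create MyModel -f Modelfile    # Modelfile points at a local GGUF file
ollama run MyModel                    # works: the model loads and responds
ollama list                           # MyModel is missing from the output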
OS
Linux
GPU
Nvidia
CPU
Intel
Ollama version
0.1.35
I have the same issue. None of the models customized via a Modelfile show up in ollama list, but they can still be found with ollama show --modelfile.
The problem persists in 0.1.36.
Yeah, me too... it also shows models I tried 2 months ago and deleted a long time ago.
Client version is 0.1.36.
After restarting Ollama it went back to normal. Solved for me.
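(On a standard Linux install that runs Ollama as a systemd service, that restart would be something like:

sudo systemctl restart ollama    # restart the server process
ollama list                      # check whether the models reappear
)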
Hi, can you provide server logs for when you run ollama list?
Log after the command "ollama list":
may 11 13:35:04 dellypop ollama[1320]: [GIN] 2024/05/11 - 13:35:04 | 200 | 22.625µs | 127.0.0.1 | HEAD "/"
may 11 13:35:04 dellypop ollama[1320]: time=2024-05-11T13:35:04.315-06:00 level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/Llama-3-CoderV2-Fast:latest error="open /usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/llama-3-coderv2-fast/latest: no such file or directory"
may 11 13:35:04 dellypop ollama[1320]: [GIN] 2024/05/11 - 13:35:04 | 200 | 167.327µs | 127.0.0.1 | GET "/api/tags"
I only have one model, called 'Llama-3-CoderV2-Fast', which was created with the command 'ollama create Llama-3-CoderV2-Fast -f Modelfile'. I notice that the capitalization is not being respected: the manifest directory on disk is "/usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/Llama-3-CoderV2-Fast", but the error above refers to the lowercased name.
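That is, going by the WARN line in the log, something like the following seems to be happening (paths reconstructed from the log above, not re-checked on disk):

ls /usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/
# -> Llama-3-CoderV2-Fast   (directory written by "ollama create")
# but the server tries to open the lowercased path:
# .../library/llama-3-coderv2-fast/latest  -> no such file or directory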
will reopen this until it's addressed in a release
I'm using a build from the main branch, but I can't reproduce this problem.
- Modelfile-gguf
FROM ./Phi-3-mini-4k-instruct-q4.gguf
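(For what it's worth, to exercise the case-sensitivity path described above you would need to create the model with a mixed-case name, as in the original report; the name below is just an example:

ollama create Phi-3-Mini-Test -f Modelfile-gguf
ollama list
)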
This has been fixed and released in 0.1.37.