[FAQ] Ollama not showing Chat Models on LLM preferences
On our Discord, people often say: "I visited http://localhost:11434/ and it shows Ollama is running, but AnythingLLM is not listing any Ollama models."

The issue is usually that they entered http://localhost:11434/ as the base URL in AnythingLLM instead of http://127.0.0.1:11434/. We mention this in our docs (https://docs.useanything.com/anythingllm-setup/llm-configuration/local/ollama), but that page is nested several levels deep: Setup > LLM Config > Local > Ollama.
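Why the two URLs can behave differently: on some systems localhost resolves first to the IPv6 address ::1, while Ollama by default listens only on IPv4 at 127.0.0.1, so a request to localhost may never reach it. As a quick way to confirm which base URL Ollama actually answers on, here is a minimal TypeScript sketch (assuming Node 18+ for the built-in fetch; /api/tags is Ollama's endpoint for listing installed models):

```ts
// Probe both base URLs against Ollama's /api/tags endpoint (which lists
// installed models) to see which one actually responds.
// Assumes Node 18+ so the global fetch API is available.
async function probe(base: string): Promise<void> {
  try {
    const res = await fetch(`${base}/api/tags`);
    const data = (await res.json()) as { models: { name: string }[] };
    console.log(`${base} -> models: ${data.models.map((m) => m.name).join(", ")}`);
  } catch {
    // Connection refused or IPv6/IPv4 mismatch: this URL will not work in AnythingLLM either.
    console.log(`${base} -> unreachable`);
  }
}

(async () => {
  await probe("http://127.0.0.1:11434"); // the URL our docs recommend
  await probe("http://localhost:11434"); // may resolve to ::1 and fail
})();
```

Whichever URL prints your model list is the one to enter in AnythingLLM's LLM preferences.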
New users might not dig through those nested folders, so it would be good to have a dedicated page in the FAQ section covering this.